Sep 30 19:32:25 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 30 19:32:25 crc restorecon[4552]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 19:32:25 crc restorecon[4552]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 19:32:25 crc restorecon[4552]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc 
restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc 
restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 
19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 19:32:25 crc restorecon[4552]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc 
restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 19:32:25 crc restorecon[4552]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 19:32:25 crc restorecon[4552]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 19:32:25 crc restorecon[4552]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 19:32:25 crc 
restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 19:32:25 crc restorecon[4552]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:25 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 
crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 19:32:26 crc restorecon[4552]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 
30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 
crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc 
restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc 
restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc 
restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc 
restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 19:32:26 crc restorecon[4552]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 19:32:26 crc restorecon[4552]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 19:32:26 crc restorecon[4552]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Sep 30 19:32:27 crc kubenswrapper[4553]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 30 19:32:27 crc kubenswrapper[4553]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Sep 30 19:32:27 crc kubenswrapper[4553]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 30 19:32:27 crc kubenswrapper[4553]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 30 19:32:27 crc kubenswrapper[4553]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 30 19:32:27 crc kubenswrapper[4553]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.237407 4553 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245600 4553 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245638 4553 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245647 4553 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245658 4553 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245667 4553 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245676 4553 feature_gate.go:330] unrecognized feature gate: Example Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245686 4553 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245697 4553 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245706 4553 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245716 4553 feature_gate.go:330] 
unrecognized feature gate: PinnedImages Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245727 4553 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245742 4553 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245756 4553 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245767 4553 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245779 4553 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245802 4553 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245812 4553 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245820 4553 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245833 4553 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245845 4553 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245853 4553 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245861 4553 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245869 4553 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245877 4553 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245884 4553 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245893 4553 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245901 4553 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245909 4553 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245917 4553 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245926 4553 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245936 4553 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245946 4553 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245955 4553 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245964 4553 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245973 4553 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245982 4553 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.245990 4553 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246000 4553 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246008 4553 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246017 4553 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246025 4553 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246033 4553 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246072 4553 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246080 4553 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246088 4553 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246096 4553 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246105 4553 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246113 4553 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246121 4553 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246129 4553 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246137 4553 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246148 4553 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246159 4553 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246168 4553 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246178 4553 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246187 4553 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246199 4553 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246211 4553 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246225 4553 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246234 4553 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246243 4553 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246252 4553 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246261 4553 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246269 4553 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246278 4553 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246286 4553 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246294 4553 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246304 4553 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246312 4553 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246320 4553 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.246327 4553 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246480 4553 flags.go:64] FLAG: --address="0.0.0.0"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246498 4553 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246516 4553 flags.go:64] FLAG: --anonymous-auth="true"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246529 4553 flags.go:64] FLAG: --application-metrics-count-limit="100"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246542 4553 flags.go:64] FLAG: --authentication-token-webhook="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246551 4553 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246565 4553 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246584 4553 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246595 4553 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246606 4553 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246616 4553 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246627 4553 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246637 4553 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246646 4553 flags.go:64] FLAG: --cgroup-root=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246655 4553 flags.go:64] FLAG: --cgroups-per-qos="true"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246664 4553 flags.go:64] FLAG: --client-ca-file=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246674 4553 flags.go:64] FLAG: --cloud-config=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246683 4553 flags.go:64] FLAG: --cloud-provider=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246692 4553 flags.go:64] FLAG: --cluster-dns="[]"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246703 4553 flags.go:64] FLAG: --cluster-domain=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246712 4553 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246722 4553 flags.go:64] FLAG: --config-dir=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246731 4553 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246741 4553 flags.go:64] FLAG: --container-log-max-files="5"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246753 4553 flags.go:64] FLAG: --container-log-max-size="10Mi"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246762 4553 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246771 4553 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246781 4553 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246790 4553 flags.go:64] FLAG: --contention-profiling="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246798 4553 flags.go:64] FLAG: --cpu-cfs-quota="true"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246807 4553 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246817 4553 flags.go:64] FLAG: --cpu-manager-policy="none"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246829 4553 flags.go:64] FLAG: --cpu-manager-policy-options=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246841 4553 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246850 4553 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246859 4553 flags.go:64] FLAG: --enable-debugging-handlers="true"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246868 4553 flags.go:64] FLAG: --enable-load-reader="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246877 4553 flags.go:64] FLAG: --enable-server="true"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246886 4553 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246899 4553 flags.go:64] FLAG: --event-burst="100"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246908 4553 flags.go:64] FLAG: --event-qps="50"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246918 4553 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246927 4553 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246937 4553 flags.go:64] FLAG: --eviction-hard=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246948 4553 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246957 4553 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246967 4553 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246977 4553 flags.go:64] FLAG: --eviction-soft=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246987 4553 flags.go:64] FLAG: --eviction-soft-grace-period=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.246995 4553 flags.go:64] FLAG: --exit-on-lock-contention="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247007 4553 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247016 4553 flags.go:64] FLAG: --experimental-mounter-path=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247026 4553 flags.go:64] FLAG: --fail-cgroupv1="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247064 4553 flags.go:64] FLAG: --fail-swap-on="true"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247075 4553 flags.go:64] FLAG: --feature-gates=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247087 4553 flags.go:64] FLAG: --file-check-frequency="20s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247097 4553 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247109 4553 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247118 4553 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247128 4553 flags.go:64] FLAG: --healthz-port="10248"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247138 4553 flags.go:64] FLAG: --help="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247147 4553 flags.go:64] FLAG: --hostname-override=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247156 4553 flags.go:64] FLAG: --housekeeping-interval="10s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247166 4553 flags.go:64] FLAG: --http-check-frequency="20s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247177 4553 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247186 4553 flags.go:64] FLAG: --image-credential-provider-config=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247195 4553 flags.go:64] FLAG: --image-gc-high-threshold="85"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247204 4553 flags.go:64] FLAG: --image-gc-low-threshold="80"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247215 4553 flags.go:64] FLAG: --image-service-endpoint=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247225 4553 flags.go:64] FLAG: --kernel-memcg-notification="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247234 4553 flags.go:64] FLAG: --kube-api-burst="100"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247243 4553 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247252 4553 flags.go:64] FLAG: --kube-api-qps="50"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247261 4553 flags.go:64] FLAG: --kube-reserved=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247270 4553 flags.go:64] FLAG: --kube-reserved-cgroup=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247279 4553 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247288 4553 flags.go:64] FLAG: --kubelet-cgroups=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247297 4553 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247306 4553 flags.go:64] FLAG: --lock-file=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247315 4553 flags.go:64] FLAG: --log-cadvisor-usage="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247325 4553 flags.go:64] FLAG: --log-flush-frequency="5s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247335 4553 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247348 4553 flags.go:64] FLAG: --log-json-split-stream="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247357 4553 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247367 4553 flags.go:64] FLAG: --log-text-split-stream="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247376 4553 flags.go:64] FLAG: --logging-format="text"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247387 4553 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247398 4553 flags.go:64] FLAG: --make-iptables-util-chains="true"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247408 4553 flags.go:64] FLAG: --manifest-url=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247419 4553 flags.go:64] FLAG: --manifest-url-header=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247431 4553 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247440 4553 flags.go:64] FLAG: --max-open-files="1000000"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247453 4553 flags.go:64] FLAG: --max-pods="110"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247464 4553 flags.go:64] FLAG: --maximum-dead-containers="-1"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247475 4553 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247486 4553 flags.go:64] FLAG: --memory-manager-policy="None"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247495 4553 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247504 4553 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247514 4553 flags.go:64] FLAG: --node-ip="192.168.126.11"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247526 4553 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247550 4553 flags.go:64] FLAG: --node-status-max-images="50"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247560 4553 flags.go:64] FLAG: --node-status-update-frequency="10s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247570 4553 flags.go:64] FLAG: --oom-score-adj="-999"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247579 4553 flags.go:64] FLAG: --pod-cidr=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247590 4553 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247606 4553 flags.go:64] FLAG: --pod-manifest-path=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247615 4553 flags.go:64] FLAG: --pod-max-pids="-1"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247624 4553 flags.go:64] FLAG: --pods-per-core="0"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247633 4553 flags.go:64] FLAG: --port="10250"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247643 4553 flags.go:64] FLAG: --protect-kernel-defaults="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247652 4553 flags.go:64] FLAG: --provider-id=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247661 4553 flags.go:64] FLAG: --qos-reserved=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247670 4553 flags.go:64] FLAG: --read-only-port="10255"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247688 4553 flags.go:64] FLAG: --register-node="true"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247697 4553 flags.go:64] FLAG: --register-schedulable="true"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247707 4553 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247721 4553 flags.go:64] FLAG: --registry-burst="10"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247730 4553 flags.go:64] FLAG: --registry-qps="5"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247740 4553 flags.go:64] FLAG: --reserved-cpus=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247749 4553 flags.go:64] FLAG: --reserved-memory=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247784 4553 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247795 4553 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247805 4553 flags.go:64] FLAG: --rotate-certificates="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247814 4553 flags.go:64] FLAG: --rotate-server-certificates="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247823 4553 flags.go:64] FLAG: --runonce="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247833 4553 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247842 4553 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247852 4553 flags.go:64] FLAG: --seccomp-default="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247861 4553 flags.go:64] FLAG: --serialize-image-pulls="true"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247870 4553 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247879 4553 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247889 4553 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247898 4553 flags.go:64] FLAG: --storage-driver-password="root"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247908 4553 flags.go:64] FLAG: --storage-driver-secure="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247918 4553 flags.go:64] FLAG: --storage-driver-table="stats"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247927 4553 flags.go:64] FLAG: --storage-driver-user="root"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247936 4553 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247947 4553 flags.go:64] FLAG: --sync-frequency="1m0s"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247956 4553 flags.go:64] FLAG: --system-cgroups=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247965 4553 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247980 4553 flags.go:64] FLAG: --system-reserved-cgroup=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.247989 4553 flags.go:64] FLAG: --tls-cert-file=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.248003 4553 flags.go:64] FLAG: --tls-cipher-suites="[]"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.248014 4553 flags.go:64] FLAG: --tls-min-version=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.248025 4553 flags.go:64] FLAG: --tls-private-key-file=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.248067 4553 flags.go:64] FLAG: --topology-manager-policy="none"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.248079 4553 flags.go:64] FLAG: --topology-manager-policy-options=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.248089 4553 flags.go:64] FLAG: --topology-manager-scope="container"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.248098 4553 flags.go:64] FLAG: --v="2"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.248110 4553 flags.go:64] FLAG: --version="false"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.248122 4553 flags.go:64] FLAG: --vmodule=""
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.248133 4553 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.248144 4553 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248367 4553 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248379 4553 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248387 4553 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248397 4553 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248406 4553 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248414 4553 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248423 4553 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248432 4553 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248440 4553 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248449 4553 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248457 4553 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248466 4553 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248474 4553 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248482 4553 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248490 4553 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248498 4553 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248505 4553 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248513 4553 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248520 4553 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248531 4553 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248541 4553 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248558 4553 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248567 4553 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248578 4553 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248591 4553 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248600 4553 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248608 4553 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248615 4553 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248623 4553 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248630 4553 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248639 4553 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248646 4553 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248654 4553 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248664 4553 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248674 4553 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248682 4553 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248690 4553 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248699 4553 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248707 4553 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248715 4553 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248724 4553 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248732 4553 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248741 4553 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248749 4553 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248757 4553 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248765 4553 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248776 4553 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248784 4553 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248792 4553 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248801 4553 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248809 4553 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248818 4553 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248826 4553 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248836 4553 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248844 4553 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248852 4553 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248863 4553 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248871 4553 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248878 4553 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248888 4553 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248896 4553 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248904 4553 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248912 4553 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248920 4553 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248928 4553 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248935 4553 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248943 4553 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248952 4553 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248959 4553 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248967 4553 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.248977 4553 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.250321 4553 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.262068 4553 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.262124 4553 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262260 4553 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262275 4553 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262285 4553 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262295 4553 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262304 4553 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262312 4553 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262321 4553 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262330 4553 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262339 4553 feature_gate.go:330] unrecognized feature gate: 
ImageStreamImportMode Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262347 4553 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262355 4553 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262362 4553 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262370 4553 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262379 4553 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262387 4553 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262395 4553 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262403 4553 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262410 4553 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262419 4553 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262427 4553 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262436 4553 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262444 4553 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262452 4553 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 
19:32:27.262459 4553 feature_gate.go:330] unrecognized feature gate: Example Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262467 4553 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262478 4553 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262488 4553 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262496 4553 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262504 4553 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262512 4553 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262520 4553 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262528 4553 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262535 4553 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262543 4553 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262553 4553 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262562 4553 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262570 4553 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262579 4553 feature_gate.go:330] 
unrecognized feature gate: VSphereMultiNetworks Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262587 4553 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262596 4553 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262604 4553 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262612 4553 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262621 4553 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262629 4553 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262637 4553 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262645 4553 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262653 4553 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262661 4553 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262669 4553 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262679 4553 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262692 4553 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262701 4553 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262710 4553 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262719 4553 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262727 4553 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262736 4553 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262744 4553 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262751 4553 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262759 4553 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262767 4553 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262775 4553 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262783 4553 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262791 4553 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262799 4553 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262807 4553 
feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262815 4553 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262822 4553 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262830 4553 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262841 4553 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262850 4553 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.262862 4553 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.262877 4553 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263138 4553 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263152 4553 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263161 4553 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263172 4553 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263180 4553 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263188 4553 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263196 4553 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263204 4553 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263212 4553 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263220 4553 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 
19:32:27.263228 4553 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263235 4553 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263243 4553 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263251 4553 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263259 4553 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263267 4553 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263275 4553 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263283 4553 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263290 4553 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263299 4553 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263307 4553 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263315 4553 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263323 4553 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263330 4553 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263339 4553 feature_gate.go:330] unrecognized feature gate: 
ClusterAPIInstallIBMCloud Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263346 4553 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263354 4553 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263362 4553 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263370 4553 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263378 4553 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263386 4553 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263393 4553 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263401 4553 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263409 4553 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263858 4553 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263872 4553 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263883 4553 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263893 4553 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263904 4553 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263912 4553 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263921 4553 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263929 4553 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263937 4553 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263945 4553 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263953 4553 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263961 4553 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263969 4553 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263977 4553 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263986 4553 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.263994 4553 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264004 4553 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264014 4553 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264024 4553 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264033 4553 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264066 4553 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264077 4553 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264085 4553 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264094 4553 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264102 4553 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264110 4553 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264118 4553 feature_gate.go:330] unrecognized feature gate: Example Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264126 4553 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264133 4553 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264141 4553 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264148 4553 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264156 4553 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAWS Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264164 4553 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264172 4553 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264180 4553 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264187 4553 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.264198 4553 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.264213 4553 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.265604 4553 server.go:940] "Client rotation is on, will bootstrap in background" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.271303 4553 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.271483 4553 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.273628 4553 server.go:997] "Starting client certificate rotation" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.273682 4553 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.273901 4553 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-12 14:53:02.716479285 +0000 UTC Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.274016 4553 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2491h20m35.442468431s for next certificate rotation Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.308472 4553 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.315644 4553 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.334902 4553 log.go:25] "Validated CRI v1 runtime API" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.375855 4553 log.go:25] "Validated CRI v1 image API" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.378807 4553 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.384672 4553 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-30-19-27-00-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.384722 4553 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.397614 4553 manager.go:217] Machine: {Timestamp:2025-09-30 19:32:27.395806232 +0000 UTC m=+0.595308392 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199472640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b BootID:825ea34c-fb99-4283-90cd-f6aa86e2aea9 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039894528 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599738368 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076107 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599734272 Type:vfs Inodes:3076107 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d8:61:4f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d8:61:4f Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ba:1c:27 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:5d:ca:40 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:b8:ad:a8 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f8:42:63 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:7a:04:48:ae:81:1e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ee:21:0e:b4:2b:d7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199472640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 
Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.397936 4553 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.398097 4553 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.399647 4553 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.399868 4553 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.399908 4553 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.401033 4553 topology_manager.go:138] "Creating topology manager with none policy" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.401129 4553 container_manager_linux.go:303] "Creating device plugin manager" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.401559 4553 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.401587 4553 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.402547 4553 state_mem.go:36] "Initialized new in-memory state store" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.402672 4553 server.go:1245] "Using root directory" path="/var/lib/kubelet" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.407559 4553 kubelet.go:418] "Attempting to sync node with API server" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.407586 4553 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.407605 4553 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.407623 4553 kubelet.go:324] "Adding apiserver pod source" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.407640 4553 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.417273 4553 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.419332 4553 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.422438 4553 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.423448 4553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Sep 30 19:32:27 crc kubenswrapper[4553]: E0930 19:32:27.423585 4553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.423765 4553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Sep 30 19:32:27 crc kubenswrapper[4553]: E0930 19:32:27.423939 4553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.424195 4553 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.424227 4553 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.424237 
4553 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.424256 4553 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.424272 4553 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.424282 4553 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.424294 4553 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.424312 4553 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.424324 4553 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.424336 4553 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.424350 4553 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.424359 4553 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.425597 4553 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.426264 4553 server.go:1280] "Started kubelet" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.427411 4553 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.427776 4553 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.17:6443: connect: connection refused Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.427715 4553 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.428424 4553 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.428553 4553 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.428589 4553 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.428624 4553 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 23:54:11.129272769 +0000 UTC Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.428712 4553 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2020h21m43.700565052s for next certificate rotation Sep 30 19:32:27 crc systemd[1]: Started Kubernetes Kubelet. 
Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.428886 4553 volume_manager.go:287] "The desired_state_of_world populator starts" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.428901 4553 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.429084 4553 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.430236 4553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Sep 30 19:32:27 crc kubenswrapper[4553]: E0930 19:32:27.430500 4553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Sep 30 19:32:27 crc kubenswrapper[4553]: E0930 19:32:27.430642 4553 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.430714 4553 server.go:460] "Adding debug handlers to kubelet server" Sep 30 19:32:27 crc kubenswrapper[4553]: E0930 19:32:27.432696 4553 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="200ms" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.434332 4553 factory.go:153] Registering CRI-O factory Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.434372 4553 factory.go:221] Registration of the crio container factory 
successfully Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.434462 4553 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.434485 4553 factory.go:55] Registering systemd factory Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.434493 4553 factory.go:221] Registration of the systemd container factory successfully Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.434537 4553 factory.go:103] Registering Raw factory Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.434553 4553 manager.go:1196] Started watching for new ooms in manager Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.435140 4553 manager.go:319] Starting recovery of all containers Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441660 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441719 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441736 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441751 4553 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441766 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441779 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441794 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441808 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441847 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441863 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441877 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441892 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441905 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441944 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441958 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441971 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.441989 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442005 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442020 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442055 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442072 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442087 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442102 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442119 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442139 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442154 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442184 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442201 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442217 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442232 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442246 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442262 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442295 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442308 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442324 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442340 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442354 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442369 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442384 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442398 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442413 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442430 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442444 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442458 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442476 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442492 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442507 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442521 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442537 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442552 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442567 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.442583 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443071 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443295 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443330 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443349 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443377 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443395 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443423 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443438 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443453 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443477 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443494 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443516 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" 
seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443534 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.443551 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: E0930 19:32:27.440850 4553 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.17:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a26542da08d95 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 19:32:27.426205077 +0000 UTC m=+0.625707217,LastTimestamp:2025-09-30 19:32:27.426205077 +0000 UTC m=+0.625707217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456154 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456360 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456442 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456503 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456525 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456537 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456567 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456583 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456595 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456625 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456643 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456756 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456807 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456820 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456835 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456845 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456861 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.456979 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465353 4553 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465446 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465468 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465480 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465492 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465503 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465513 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465524 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465536 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465547 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465559 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465570 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465582 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465593 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" 
seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465604 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465614 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465630 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465640 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465656 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465666 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465677 4553 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465732 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465761 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465773 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465787 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465799 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465810 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465821 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465832 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465844 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465856 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465870 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465882 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465891 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465902 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465912 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465923 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465948 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465958 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465969 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465979 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465988 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.465997 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466005 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466015 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" 
seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466024 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466062 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466073 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466084 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466110 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466125 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 
19:32:27.466140 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466155 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466169 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466184 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466195 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466208 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466224 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466252 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466263 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466287 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466298 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466323 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466337 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466352 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466363 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466372 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466383 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466393 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466403 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466414 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466424 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466434 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466445 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466456 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466467 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466476 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466487 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466497 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466508 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466519 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466528 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466537 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466549 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466558 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466586 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466596 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466625 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" 
seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466635 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466645 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466657 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466667 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466677 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466701 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466714 4553 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466737 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466749 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466759 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466768 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466777 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466787 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466812 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466821 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466832 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466842 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466852 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466862 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466872 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466882 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466908 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466918 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466930 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466940 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" 
seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466950 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466959 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466969 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.466979 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.467005 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.467014 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: 
I0930 19:32:27.467026 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.467049 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.467059 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.467068 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.467077 4553 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.467088 4553 reconstruct.go:97] "Volume reconstruction finished" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.467097 4553 reconciler.go:26] "Reconciler: start to sync state" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.469754 4553 manager.go:324] Recovery completed Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.481533 4553 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.486769 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.486821 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.486831 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.487839 4553 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.487876 4553 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.487906 4553 state_mem.go:36] "Initialized new in-memory state store" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.500705 4553 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.502731 4553 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.502780 4553 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.502813 4553 kubelet.go:2335] "Starting kubelet main sync loop" Sep 30 19:32:27 crc kubenswrapper[4553]: E0930 19:32:27.502944 4553 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.503667 4553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Sep 30 19:32:27 crc kubenswrapper[4553]: E0930 19:32:27.503823 4553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.507335 4553 policy_none.go:49] "None policy: Start" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.508939 4553 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.509000 4553 state_mem.go:35] "Initializing new in-memory state store" Sep 30 19:32:27 crc kubenswrapper[4553]: E0930 19:32:27.531353 4553 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.557149 4553 manager.go:334] "Starting Device Plugin manager" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.557586 4553 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.557610 4553 server.go:79] "Starting device plugin registration server" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.558135 4553 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.558155 4553 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.558446 4553 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.558535 4553 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.558545 4553 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 30 19:32:27 crc kubenswrapper[4553]: E0930 19:32:27.565325 4553 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.604077 4553 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.604239 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.606180 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.606230 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.606246 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.606449 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.606844 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.606929 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.607544 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.607564 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.607573 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.607665 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.608138 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.608183 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.608191 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.608217 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.608232 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.609281 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.609300 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.609307 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.609378 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.609715 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.609751 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.609944 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.609964 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.609974 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.610063 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.610153 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.610209 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611003 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611055 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611075 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611085 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611093 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611125 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611094 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611159 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611176 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611416 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611449 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611561 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611579 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.611589 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.612697 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.612717 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.612727 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:27 crc kubenswrapper[4553]: E0930 19:32:27.633965 4553 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="400ms" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.658769 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.659829 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:27 crc kubenswrapper[4553]: 
I0930 19:32:27.659941 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.660011 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.660154 4553 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 19:32:27 crc kubenswrapper[4553]: E0930 19:32:27.660624 4553 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.17:6443: connect: connection refused" node="crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.669182 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.669312 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.669470 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.669555 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.669626 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.669715 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.669804 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.669879 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.669949 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.670018 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.670113 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.670184 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.670260 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.670332 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.670399 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771367 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771415 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771435 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771449 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771463 4553 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771478 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771531 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771548 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771564 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771579 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771595 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771586 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771618 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771633 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771637 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771614 4553 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771680 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771696 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771721 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771707 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771714 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771781 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771798 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771819 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771797 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771587 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771817 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771863 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771888 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.771989 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.861837 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.863502 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.863535 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.863544 4553 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.863573 4553 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 19:32:27 crc kubenswrapper[4553]: E0930 19:32:27.863932 4553 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.17:6443: connect: connection refused" node="crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.941728 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.949169 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.962610 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.978235 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: I0930 19:32:27.983342 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.985640 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-1e55cfbf703f7ea009245f69effbc79a0eb4a756761c841e894474136c481d5a WatchSource:0}: Error finding container 1e55cfbf703f7ea009245f69effbc79a0eb4a756761c841e894474136c481d5a: Status 404 returned error can't find the container with id 1e55cfbf703f7ea009245f69effbc79a0eb4a756761c841e894474136c481d5a Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.992707 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-4ecc3799d3feb99131e7cc20339659c93967f9a4feb83542dd5cead595767980 WatchSource:0}: Error finding container 4ecc3799d3feb99131e7cc20339659c93967f9a4feb83542dd5cead595767980: Status 404 returned error can't find the container with id 4ecc3799d3feb99131e7cc20339659c93967f9a4feb83542dd5cead595767980 Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.996674 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-2e54cfe5d541b6a199fa6ea355172c56b900884eace2d1959f570405af3e6dcb WatchSource:0}: Error finding container 2e54cfe5d541b6a199fa6ea355172c56b900884eace2d1959f570405af3e6dcb: Status 404 returned error can't find the container with id 2e54cfe5d541b6a199fa6ea355172c56b900884eace2d1959f570405af3e6dcb Sep 30 19:32:27 crc kubenswrapper[4553]: W0930 19:32:27.997832 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-e2b9c402d328a7ee2d951e8a6d7932333f1d4fcf112486930ca1e2a4dfcb7c61 
WatchSource:0}: Error finding container e2b9c402d328a7ee2d951e8a6d7932333f1d4fcf112486930ca1e2a4dfcb7c61: Status 404 returned error can't find the container with id e2b9c402d328a7ee2d951e8a6d7932333f1d4fcf112486930ca1e2a4dfcb7c61 Sep 30 19:32:28 crc kubenswrapper[4553]: E0930 19:32:28.035092 4553 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="800ms" Sep 30 19:32:28 crc kubenswrapper[4553]: E0930 19:32:28.157503 4553 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.17:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a26542da08d95 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 19:32:27.426205077 +0000 UTC m=+0.625707217,LastTimestamp:2025-09-30 19:32:27.426205077 +0000 UTC m=+0.625707217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Sep 30 19:32:28 crc kubenswrapper[4553]: I0930 19:32:28.264464 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:28 crc kubenswrapper[4553]: I0930 19:32:28.265953 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:28 crc kubenswrapper[4553]: I0930 19:32:28.265991 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:28 crc kubenswrapper[4553]: I0930 19:32:28.266001 
4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:28 crc kubenswrapper[4553]: I0930 19:32:28.266024 4553 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 19:32:28 crc kubenswrapper[4553]: E0930 19:32:28.266434 4553 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.17:6443: connect: connection refused" node="crc" Sep 30 19:32:28 crc kubenswrapper[4553]: W0930 19:32:28.277017 4553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Sep 30 19:32:28 crc kubenswrapper[4553]: E0930 19:32:28.277147 4553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Sep 30 19:32:28 crc kubenswrapper[4553]: I0930 19:32:28.428975 4553 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Sep 30 19:32:28 crc kubenswrapper[4553]: I0930 19:32:28.508841 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1e55cfbf703f7ea009245f69effbc79a0eb4a756761c841e894474136c481d5a"} Sep 30 19:32:28 crc kubenswrapper[4553]: I0930 19:32:28.509870 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e2b9c402d328a7ee2d951e8a6d7932333f1d4fcf112486930ca1e2a4dfcb7c61"} Sep 30 19:32:28 crc kubenswrapper[4553]: I0930 19:32:28.510842 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2e54cfe5d541b6a199fa6ea355172c56b900884eace2d1959f570405af3e6dcb"} Sep 30 19:32:28 crc kubenswrapper[4553]: I0930 19:32:28.511837 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4ecc3799d3feb99131e7cc20339659c93967f9a4feb83542dd5cead595767980"} Sep 30 19:32:28 crc kubenswrapper[4553]: I0930 19:32:28.512661 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"be013fe37c32a13ace133ea0f3deecafbbda08b4f2284231840f241be44e242c"} Sep 30 19:32:28 crc kubenswrapper[4553]: W0930 19:32:28.550970 4553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Sep 30 19:32:28 crc kubenswrapper[4553]: E0930 19:32:28.551091 4553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Sep 30 19:32:28 crc kubenswrapper[4553]: W0930 19:32:28.768392 4553 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Sep 30 19:32:28 crc kubenswrapper[4553]: E0930 19:32:28.768518 4553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Sep 30 19:32:28 crc kubenswrapper[4553]: W0930 19:32:28.791648 4553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Sep 30 19:32:28 crc kubenswrapper[4553]: E0930 19:32:28.791743 4553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError" Sep 30 19:32:28 crc kubenswrapper[4553]: E0930 19:32:28.835695 4553 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="1.6s" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.067261 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.068430 4553 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.068473 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.068486 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.068556 4553 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 19:32:29 crc kubenswrapper[4553]: E0930 19:32:29.069680 4553 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.17:6443: connect: connection refused" node="crc" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.429637 4553 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.515853 4553 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a" exitCode=0 Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.515933 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a"} Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.516058 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.516973 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.516998 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.517006 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.519378 4553 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed" exitCode=0 Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.519448 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed"} Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.519666 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.521272 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.521300 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.521309 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.523621 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab"} Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.523671 
4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b"} Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.523686 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528"} Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.523709 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e"} Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.523782 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.525185 4553 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="0fd673fbde795ab7edca5fcb2a0cbf9115639b912f914075014349f1649132d3" exitCode=0 Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.525378 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.525802 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"0fd673fbde795ab7edca5fcb2a0cbf9115639b912f914075014349f1649132d3"} Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.526217 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.526241 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.526249 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.526653 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.526704 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.526729 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.527823 4553 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea" exitCode=0 Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.527854 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea"} Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.527934 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.529068 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.529092 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:29 crc 
kubenswrapper[4553]: I0930 19:32:29.529101 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.530281 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.532111 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.532145 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:29 crc kubenswrapper[4553]: I0930 19:32:29.532157 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.015976 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.428795 4553 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused
Sep 30 19:32:30 crc kubenswrapper[4553]: W0930 19:32:30.429454 4553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused
Sep 30 19:32:30 crc kubenswrapper[4553]: E0930 19:32:30.429564 4553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError"
Sep 30 19:32:30 crc kubenswrapper[4553]: E0930 19:32:30.436993 4553 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.17:6443: connect: connection refused" interval="3.2s"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.532653 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b85ac1b239ec48caf99e6b4cad1a9d38866019acf3348f1afa5f1bb361537560"}
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.532983 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235"}
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.533122 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319"}
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.533193 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7"}
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.533410 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73"}
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.533080 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.534752 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.534834 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.534887 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.535848 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"196c8c016a4602b9d6bda11b4c30276c3536485dc8c75529c0c7059816768d25"}
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.535933 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8fb42a36284069544112bc5523c6c89b0d0cae4b3cfd7bb292a05691e1de01cc"}
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.535993 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ae915f74f875c4b4aae052da19bcce9693322ed42573211a3fd458761891b415"}
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.536144 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.536878 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.536952 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.537005 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.538416 4553 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b" exitCode=0
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.538523 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b"}
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.538664 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.539266 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.539423 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.539511 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.540256 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.540316 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.540260 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"85ae6eef7edd538ebef5374a4feae9a8889fb7d671191b3cc3d86f083fc60b80"}
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.541381 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.541497 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.541578 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.541383 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.542186 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.542196 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.670249 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.671707 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.671756 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.671769 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:30 crc kubenswrapper[4553]: I0930 19:32:30.671805 4553 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 19:32:30 crc kubenswrapper[4553]: E0930 19:32:30.672389 4553 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.17:6443: connect: connection refused" node="crc"
Sep 30 19:32:30 crc kubenswrapper[4553]: W0930 19:32:30.716119 4553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.17:6443: connect: connection refused
Sep 30 19:32:30 crc kubenswrapper[4553]: E0930 19:32:30.716213 4553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.17:6443: connect: connection refused" logger="UnhandledError"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.546945 4553 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75" exitCode=0
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.548140 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.548290 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.548528 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.548701 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.547161 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.547187 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75"}
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.549340 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.549395 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.550299 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.550343 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.550350 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.550392 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.550410 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.550360 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.552017 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.552085 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.552102 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.554136 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.554300 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.554330 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.554357 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.554535 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:31 crc kubenswrapper[4553]: I0930 19:32:31.554556 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:32 crc kubenswrapper[4553]: I0930 19:32:32.556023 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9"}
Sep 30 19:32:32 crc kubenswrapper[4553]: I0930 19:32:32.556097 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061"}
Sep 30 19:32:32 crc kubenswrapper[4553]: I0930 19:32:32.556117 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa"}
Sep 30 19:32:32 crc kubenswrapper[4553]: I0930 19:32:32.556135 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9"}
Sep 30 19:32:32 crc kubenswrapper[4553]: I0930 19:32:32.882473 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 19:32:32 crc kubenswrapper[4553]: I0930 19:32:32.882687 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:32 crc kubenswrapper[4553]: I0930 19:32:32.883998 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:32 crc kubenswrapper[4553]: I0930 19:32:32.884071 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:32 crc kubenswrapper[4553]: I0930 19:32:32.884093 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:32 crc kubenswrapper[4553]: I0930 19:32:32.890769 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.016366 4553 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.016460 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.566974 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.567500 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.567803 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2"}
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.567842 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.568335 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.568360 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.568370 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.568994 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.569088 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.569110 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.834386 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.872656 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.873748 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.873777 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.873786 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:33 crc kubenswrapper[4553]: I0930 19:32:33.873806 4553 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.571179 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.571206 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.573216 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.573314 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.573344 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.573366 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.573388 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.573446 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.977410 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.977773 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.977880 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.980090 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.980148 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:34 crc kubenswrapper[4553]: I0930 19:32:34.980163 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.133278 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.133484 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.138078 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.138136 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.138151 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.332127 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.574117 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.574980 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.574164 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.576506 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.576551 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.576564 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.577293 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.577370 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:35 crc kubenswrapper[4553]: I0930 19:32:35.577383 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:37 crc kubenswrapper[4553]: E0930 19:32:37.565469 4553 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Sep 30 19:32:37 crc kubenswrapper[4553]: I0930 19:32:37.774592 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 19:32:37 crc kubenswrapper[4553]: I0930 19:32:37.774816 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:37 crc kubenswrapper[4553]: I0930 19:32:37.776114 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:37 crc kubenswrapper[4553]: I0930 19:32:37.776201 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:37 crc kubenswrapper[4553]: I0930 19:32:37.776221 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:37 crc kubenswrapper[4553]: I0930 19:32:37.820723 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Sep 30 19:32:37 crc kubenswrapper[4553]: I0930 19:32:37.820919 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:37 crc kubenswrapper[4553]: I0930 19:32:37.822286 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:37 crc kubenswrapper[4553]: I0930 19:32:37.822322 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:37 crc kubenswrapper[4553]: I0930 19:32:37.822334 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:38 crc kubenswrapper[4553]: I0930 19:32:38.321180 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 19:32:38 crc kubenswrapper[4553]: I0930 19:32:38.321283 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:38 crc kubenswrapper[4553]: I0930 19:32:38.322656 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:38 crc kubenswrapper[4553]: I0930 19:32:38.322682 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:38 crc kubenswrapper[4553]: I0930 19:32:38.322691 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:39 crc kubenswrapper[4553]: I0930 19:32:39.644152 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Sep 30 19:32:39 crc kubenswrapper[4553]: I0930 19:32:39.644333 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:39 crc kubenswrapper[4553]: I0930 19:32:39.645525 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:39 crc kubenswrapper[4553]: I0930 19:32:39.645568 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:39 crc kubenswrapper[4553]: I0930 19:32:39.645580 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:41 crc kubenswrapper[4553]: W0930 19:32:41.123141 4553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Sep 30 19:32:41 crc kubenswrapper[4553]: I0930 19:32:41.123236 4553 trace.go:236] Trace[1286992009]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 19:32:31.122) (total time: 10001ms):
Sep 30 19:32:41 crc kubenswrapper[4553]: Trace[1286992009]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (19:32:41.123)
Sep 30 19:32:41 crc kubenswrapper[4553]: Trace[1286992009]: [10.001070714s] [10.001070714s] END
Sep 30 19:32:41 crc kubenswrapper[4553]: E0930 19:32:41.123258 4553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Sep 30 19:32:41 crc kubenswrapper[4553]: I0930 19:32:41.166362 4553 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38810->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Sep 30 19:32:41 crc kubenswrapper[4553]: I0930 19:32:41.166443 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38810->192.168.126.11:17697: read: connection reset by peer"
Sep 30 19:32:41 crc kubenswrapper[4553]: I0930 19:32:41.429591 4553 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Sep 30 19:32:41 crc kubenswrapper[4553]: W0930 19:32:41.520246 4553 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Sep 30 19:32:41 crc kubenswrapper[4553]: I0930 19:32:41.520327 4553 trace.go:236] Trace[1957250902]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 19:32:31.518) (total time: 10001ms):
Sep 30 19:32:41 crc kubenswrapper[4553]: Trace[1957250902]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (19:32:41.520)
Sep 30 19:32:41 crc kubenswrapper[4553]: Trace[1957250902]: [10.001761238s] [10.001761238s] END
Sep 30 19:32:41 crc kubenswrapper[4553]: E0930 19:32:41.520344 4553 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Sep 30 19:32:41 crc kubenswrapper[4553]: I0930 19:32:41.589006 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Sep 30 19:32:41 crc kubenswrapper[4553]: I0930 19:32:41.590723 4553 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b85ac1b239ec48caf99e6b4cad1a9d38866019acf3348f1afa5f1bb361537560" exitCode=255
Sep 30 19:32:41 crc kubenswrapper[4553]: I0930 19:32:41.590767 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b85ac1b239ec48caf99e6b4cad1a9d38866019acf3348f1afa5f1bb361537560"}
Sep 30 19:32:41 crc kubenswrapper[4553]: I0930 19:32:41.590894 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:41 crc kubenswrapper[4553]: I0930 19:32:41.591612 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:41 crc kubenswrapper[4553]: I0930 19:32:41.591642 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:41 crc kubenswrapper[4553]: I0930 19:32:41.591657 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:41 crc kubenswrapper[4553]: I0930 19:32:41.592189 4553 scope.go:117] "RemoveContainer" containerID="b85ac1b239ec48caf99e6b4cad1a9d38866019acf3348f1afa5f1bb361537560"
Sep 30 19:32:42 crc kubenswrapper[4553]: I0930 19:32:42.104082 4553 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Sep 30 19:32:42 crc kubenswrapper[4553]: I0930 19:32:42.104149 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Sep 30 19:32:42 crc kubenswrapper[4553]: I0930 19:32:42.107551 4553 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Sep 30 19:32:42 crc kubenswrapper[4553]: I0930 19:32:42.107621 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Sep 30 19:32:42 crc kubenswrapper[4553]: I0930 19:32:42.594600 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Sep 30 19:32:42 crc kubenswrapper[4553]: I0930 19:32:42.596617 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082"}
Sep 30 19:32:42 crc kubenswrapper[4553]: I0930 19:32:42.596767 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:42 crc kubenswrapper[4553]: I0930 19:32:42.597595 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:42 crc kubenswrapper[4553]: I0930 19:32:42.597642 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:42 crc kubenswrapper[4553]: I0930 19:32:42.597658 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:43 crc kubenswrapper[4553]: I0930 19:32:43.016771 4553 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Sep 30 19:32:43 crc kubenswrapper[4553]: I0930 19:32:43.016845 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Sep 30 19:32:45 crc kubenswrapper[4553]: I0930 19:32:45.334466 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 19:32:45 crc kubenswrapper[4553]: I0930 19:32:45.334598 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 19:32:45 crc kubenswrapper[4553]: I0930 19:32:45.334689 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 19:32:45 crc kubenswrapper[4553]: I0930 19:32:45.335481 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 19:32:45 crc kubenswrapper[4553]: I0930 19:32:45.335510 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 19:32:45 crc kubenswrapper[4553]: I0930 19:32:45.335523 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 19:32:45 
crc kubenswrapper[4553]: I0930 19:32:45.338181 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:32:45 crc kubenswrapper[4553]: I0930 19:32:45.586974 4553 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 30 19:32:45 crc kubenswrapper[4553]: I0930 19:32:45.602410 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:45 crc kubenswrapper[4553]: I0930 19:32:45.603136 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:45 crc kubenswrapper[4553]: I0930 19:32:45.603185 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:45 crc kubenswrapper[4553]: I0930 19:32:45.603195 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:45 crc kubenswrapper[4553]: I0930 19:32:45.608711 4553 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 30 19:32:46 crc kubenswrapper[4553]: I0930 19:32:46.604741 4553 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 19:32:46 crc kubenswrapper[4553]: I0930 19:32:46.605627 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:46 crc kubenswrapper[4553]: I0930 19:32:46.605814 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:46 crc kubenswrapper[4553]: I0930 19:32:46.605942 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.092655 4553 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.101357 4553 trace.go:236] Trace[1086437929]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 19:32:36.330) (total time: 10771ms): Sep 30 19:32:47 crc kubenswrapper[4553]: Trace[1086437929]: ---"Objects listed" error: 10771ms (19:32:47.101) Sep 30 19:32:47 crc kubenswrapper[4553]: Trace[1086437929]: [10.771165607s] [10.771165607s] END Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.101404 4553 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.102301 4553 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.103139 4553 trace.go:236] Trace[966679019]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 19:32:35.446) (total time: 11656ms): Sep 30 19:32:47 crc kubenswrapper[4553]: Trace[966679019]: ---"Objects listed" error: 11656ms (19:32:47.103) Sep 30 19:32:47 crc kubenswrapper[4553]: Trace[966679019]: [11.656329268s] [11.656329268s] END Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.103159 4553 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.112350 4553 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.112971 4553 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.114133 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 
19:32:47.114272 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.114365 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.114461 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.114538 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:47Z","lastTransitionTime":"2025-09-30T19:32:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.128824 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54
-d7d914be9c4b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.134244 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.134502 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.134579 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.134646 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.134730 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:47Z","lastTransitionTime":"2025-09-30T19:32:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.146119 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54
-d7d914be9c4b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.151802 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.152055 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.152217 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.152309 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.152387 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:47Z","lastTransitionTime":"2025-09-30T19:32:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.172521 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54
-d7d914be9c4b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.179487 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.179541 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.179555 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.179582 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.179596 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:47Z","lastTransitionTime":"2025-09-30T19:32:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.192253 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54
-d7d914be9c4b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.198716 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.198954 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.199020 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.199121 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.199200 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:47Z","lastTransitionTime":"2025-09-30T19:32:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.212103 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54
-d7d914be9c4b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.212554 4553 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.214505 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.214645 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.214761 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.214886 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.215008 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:47Z","lastTransitionTime":"2025-09-30T19:32:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.317900 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.317943 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.317952 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.317971 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.317982 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:47Z","lastTransitionTime":"2025-09-30T19:32:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.419060 4553 apiserver.go:52] "Watching apiserver" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.420491 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.420535 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.420550 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.420573 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.420584 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:47Z","lastTransitionTime":"2025-09-30T19:32:47Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.421892 4553 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.422316 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-p4qgs","openshift-machine-config-operator/machine-config-daemon-9n4dl","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.422879 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.422942 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.422967 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.422979 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.423013 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.422967 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.423117 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.423183 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.423248 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p4qgs" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.423768 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.423820 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.427424 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.427488 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.427514 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.427533 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.427644 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.427912 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.428179 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.428266 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.428467 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.428518 4553 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"openshift-service-ca.crt" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.428773 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.428939 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.428974 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.429119 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.429220 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.429302 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.430131 4553 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.431975 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.447346 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.462174 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.476770 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.489341 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.501773 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505073 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505131 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505171 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505206 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505244 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505274 4553 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505303 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505331 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505355 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505381 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505407 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505434 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505459 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505492 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505520 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505543 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505572 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505605 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505631 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505661 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505691 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505722 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505744 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505774 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505845 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505875 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.505897 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.506645 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.506685 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.506716 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.506741 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.506767 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.506796 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.506823 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.506957 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.506954 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.506987 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.506960 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507013 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507211 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507327 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507360 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507387 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 19:32:47 crc kubenswrapper[4553]: 
I0930 19:32:47.507414 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507439 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507510 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507497 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507766 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507772 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507828 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507866 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507896 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507927 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507958 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507920 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.507985 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508003 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508016 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508063 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508092 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508096 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508130 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508163 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508195 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508221 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508256 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508284 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508297 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508313 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508339 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508369 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508402 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508428 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508458 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508488 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508513 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508545 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508571 4553 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508582 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508601 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508626 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508652 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508681 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508705 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508709 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508736 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508768 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508791 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508825 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508856 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508886 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508895 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.508975 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509016 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509068 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509091 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509119 4553 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509379 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509411 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509440 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509478 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509504 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509529 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509565 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509586 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509613 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509608 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509646 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509673 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509722 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509743 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509785 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509810 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509830 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509867 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509887 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509911 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509957 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509978 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510002 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510070 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510094 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510133 4553 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510156 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510176 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510215 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510243 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510268 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510303 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510326 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510364 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510384 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510407 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510447 
4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510465 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512133 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512194 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512228 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512265 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512298 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512332 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512365 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512399 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512436 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512470 4553 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512500 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512531 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512556 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512589 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512621 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod 
\"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512648 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512676 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512710 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512775 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512807 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512837 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512867 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512899 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512930 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512962 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512997 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 
19:32:47.513026 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.513076 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.513108 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.513150 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.513181 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.513211 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" 
(UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.513240 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.513271 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.513296 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.513328 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.515704 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.515816 4553 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.515915 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516022 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516169 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516269 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516385 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516512 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516621 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516724 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516829 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516929 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.517031 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.517150 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.517263 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509664 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.523456 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.509944 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510206 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510926 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.511255 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.511428 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.511602 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.511802 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.511945 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.511681 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512201 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512542 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512942 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.512952 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.513304 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.513612 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.510983 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.514128 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.514178 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.514471 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.514444 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.514542 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.514677 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.514800 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.514809 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.515095 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.515201 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.515258 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.515336 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.515424 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.515701 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.515933 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516017 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516176 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516324 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516650 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.516737 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.517174 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.517183 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.517274 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.517346 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.517369 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.517385 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:32:48.017295737 +0000 UTC m=+21.216797867 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.523931 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.524031 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.524071 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.524159 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.524191 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.524196 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.517999 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.518204 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.518321 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.518428 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.524252 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.519030 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.519122 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.519158 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.519182 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.519242 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.519633 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.519634 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.519818 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.520026 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.520208 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.520287 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.520438 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.520561 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.519404 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.522648 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.522645 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.522859 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.522964 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.523099 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.523112 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.524367 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.524428 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.524527 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.524564 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.524613 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.524667 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.524803 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.525093 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.525106 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.525147 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.525164 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.523799 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.525930 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.517728 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.526114 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.526319 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.526523 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.526854 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.527214 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.523548 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.527456 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.523017 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.527740 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.527839 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.527925 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.528083 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.528747 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.528100 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.528872 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.528918 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.529230 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.529272 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.529777 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.530477 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.529821 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.530006 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.530294 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.530444 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.530775 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.531163 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.531740 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.531914 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.531943 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.532068 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.532127 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.532416 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.532461 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.532595 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.532966 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.533021 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.533493 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.533497 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.533916 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.533957 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.534278 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.534908 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.523934 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535635 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535667 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535822 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535843 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535864 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535885 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535903 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535927 4553 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535947 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535970 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535993 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.536058 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.536079 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.536098 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.536115 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.536122 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.523923 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535358 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535928 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.535976 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.536026 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.536273 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.536418 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.536481 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.536530 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.536783 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.536930 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537030 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537100 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.536133 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537248 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537267 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537285 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537310 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537364 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537381 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537459 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537492 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537511 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537530 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537549 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537559 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537576 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537589 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537596 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537619 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537639 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1e817c67-7688-42d4-8a82-ce72282cbb51-rootfs\") pod \"machine-config-daemon-9n4dl\" (UID: \"1e817c67-7688-42d4-8a82-ce72282cbb51\") " pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537810 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537997 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e817c67-7688-42d4-8a82-ce72282cbb51-proxy-tls\") pod \"machine-config-daemon-9n4dl\" (UID: \"1e817c67-7688-42d4-8a82-ce72282cbb51\") " pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538030 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538136 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgqgj\" (UniqueName: \"kubernetes.io/projected/ff92820c-07f5-4503-8d99-5428f5fbecb8-kube-api-access-qgqgj\") pod \"node-resolver-p4qgs\" (UID: \"ff92820c-07f5-4503-8d99-5428f5fbecb8\") " pod="openshift-dns/node-resolver-p4qgs" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538165 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538184 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538224 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ff92820c-07f5-4503-8d99-5428f5fbecb8-hosts-file\") pod \"node-resolver-p4qgs\" (UID: \"ff92820c-07f5-4503-8d99-5428f5fbecb8\") " pod="openshift-dns/node-resolver-p4qgs" Sep 30 
19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538246 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538267 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538286 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538341 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e817c67-7688-42d4-8a82-ce72282cbb51-mcd-auth-proxy-config\") pod \"machine-config-daemon-9n4dl\" (UID: \"1e817c67-7688-42d4-8a82-ce72282cbb51\") " pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538360 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhn66\" (UniqueName: 
\"kubernetes.io/projected/1e817c67-7688-42d4-8a82-ce72282cbb51-kube-api-access-lhn66\") pod \"machine-config-daemon-9n4dl\" (UID: \"1e817c67-7688-42d4-8a82-ce72282cbb51\") " pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538863 4553 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538880 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538890 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538901 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538915 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.538944 4553 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539228 4553 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539241 4553 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539251 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539265 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539277 4553 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539288 4553 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539300 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539314 4553 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539327 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539336 4553 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539347 4553 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539356 4553 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539365 4553 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539375 4553 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539386 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 
19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539395 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539404 4553 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539414 4553 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539424 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539433 4553 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539445 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539456 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539466 4553 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539475 4553 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539485 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539495 4553 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539535 4553 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539543 4553 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539551 4553 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539563 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539573 4553 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539584 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539594 4553 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539603 4553 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539614 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539623 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539632 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath 
\"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539641 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539651 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539660 4553 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539669 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539679 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539688 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539698 4553 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc 
kubenswrapper[4553]: I0930 19:32:47.539707 4553 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539717 4553 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539726 4553 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539734 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539744 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539754 4553 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539762 4553 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539770 4553 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539780 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539790 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539800 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539812 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539822 4553 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539831 4553 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539840 4553 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539850 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539858 4553 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539867 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539874 4553 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539883 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.539891 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540174 4553 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node 
\"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540184 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540194 4553 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540203 4553 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540211 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540223 4553 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540232 4553 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540243 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540283 4553 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540297 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540309 4553 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540322 4553 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540331 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540340 4553 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540350 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540359 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540368 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540377 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540409 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540420 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540430 4553 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540494 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540523 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540533 4553 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540541 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540552 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540593 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540604 4553 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540615 4553 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540623 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540633 4553 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540643 4553 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540655 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540666 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540677 4553 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540688 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540699 4553 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540709 4553 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540719 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540728 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540737 4553 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540748 4553 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540757 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540767 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node 
\"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540776 4553 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540785 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540795 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540804 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540813 4553 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540821 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540829 4553 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 
19:32:47.540839 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540848 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540857 4553 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540846 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.540994 4553 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.541123 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:48.041103819 +0000 UTC m=+21.240605949 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.541673 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.537601 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542198 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542212 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:47Z","lastTransitionTime":"2025-09-30T19:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.540866 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542345 4553 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542356 4553 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542366 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542376 4553 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542386 4553 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542396 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 
19:32:47.542404 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542414 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542423 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542431 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542440 4553 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542449 4553 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542459 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542468 4553 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542477 4553 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542485 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542494 4553 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542502 4553 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542511 4553 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542521 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542529 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542540 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542544 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542548 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542578 4553 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542588 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542598 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542609 4553 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542618 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542627 4553 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.542636 4553 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.543108 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.543107 4553 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.543810 4553 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.543852 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:48.043837353 +0000 UTC m=+21.243339483 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.553200 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.557552 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.558239 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.558937 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.560134 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.560475 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.560557 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.561424 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.561775 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.562408 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.563176 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.563540 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.563600 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.564741 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.565981 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.566118 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.566134 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.566129 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.566147 4553 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.566258 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:48.066237257 +0000 UTC m=+21.265739387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.567648 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.570979 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.571359 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.571405 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.571450 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.571468 4553 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.571560 4553 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:48.071542119 +0000 UTC m=+21.271044249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.572715 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.573271 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.573518 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.573634 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.574108 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.574313 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.574474 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.574595 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.576662 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.577075 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.577224 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.578097 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.578177 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.579586 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.581128 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.579880 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.580325 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.581053 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.581167 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.581352 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.581460 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.582080 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.581717 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.582448 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.583024 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.583622 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.585001 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.586199 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.587236 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.588558 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.591639 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.592545 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.594623 4553 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.594918 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.596796 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.598299 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.600496 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.601915 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.602443 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.604754 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.605651 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.606377 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.606431 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.606884 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.608166 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.608829 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.609392 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.610432 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.611094 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.611529 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.612148 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.612706 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.613174 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.613729 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.614738 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.615593 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.616440 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.616925 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.617463 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.618536 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.618917 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.619171 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.620287 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.620844 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082"} Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.620915 4553 scope.go:117] "RemoveContainer" 
containerID="b85ac1b239ec48caf99e6b4cad1a9d38866019acf3348f1afa5f1bb361537560" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.621809 4553 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082" exitCode=255 Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.635597 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.640006 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.643629 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-vzlwd"] Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.644214 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-swqk9"] Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.644610 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.644678 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.644764 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646292 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ff92820c-07f5-4503-8d99-5428f5fbecb8-hosts-file\") pod \"node-resolver-p4qgs\" (UID: \"ff92820c-07f5-4503-8d99-5428f5fbecb8\") " pod="openshift-dns/node-resolver-p4qgs" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646350 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e817c67-7688-42d4-8a82-ce72282cbb51-mcd-auth-proxy-config\") pod \"machine-config-daemon-9n4dl\" (UID: \"1e817c67-7688-42d4-8a82-ce72282cbb51\") " pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646375 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhn66\" (UniqueName: \"kubernetes.io/projected/1e817c67-7688-42d4-8a82-ce72282cbb51-kube-api-access-lhn66\") pod \"machine-config-daemon-9n4dl\" (UID: \"1e817c67-7688-42d4-8a82-ce72282cbb51\") " pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646404 4553 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646425 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646481 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgqgj\" (UniqueName: \"kubernetes.io/projected/ff92820c-07f5-4503-8d99-5428f5fbecb8-kube-api-access-qgqgj\") pod \"node-resolver-p4qgs\" (UID: \"ff92820c-07f5-4503-8d99-5428f5fbecb8\") " pod="openshift-dns/node-resolver-p4qgs" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646506 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1e817c67-7688-42d4-8a82-ce72282cbb51-rootfs\") pod \"machine-config-daemon-9n4dl\" (UID: \"1e817c67-7688-42d4-8a82-ce72282cbb51\") " pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646536 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e817c67-7688-42d4-8a82-ce72282cbb51-proxy-tls\") pod \"machine-config-daemon-9n4dl\" (UID: \"1e817c67-7688-42d4-8a82-ce72282cbb51\") " pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646584 4553 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646598 4553 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646610 4553 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646621 4553 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646636 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646646 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646656 4553 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646668 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646684 4553 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646695 4553 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646706 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646717 4553 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646732 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646744 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646755 4553 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646771 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646782 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646794 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646805 4553 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646821 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646831 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646843 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") 
on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646855 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646870 4553 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646881 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646892 4553 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646906 4553 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646918 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646929 4553 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646940 4553 
reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646955 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646965 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646976 4553 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.646989 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.647004 4553 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.647016 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.647027 4553 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.647055 4553 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.654620 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-qwr6w"] Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.655117 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ff92820c-07f5-4503-8d99-5428f5fbecb8-hosts-file\") pod \"node-resolver-p4qgs\" (UID: \"ff92820c-07f5-4503-8d99-5428f5fbecb8\") " pod="openshift-dns/node-resolver-p4qgs" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.655825 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fmsrf"] Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.655896 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e817c67-7688-42d4-8a82-ce72282cbb51-mcd-auth-proxy-config\") pod \"machine-config-daemon-9n4dl\" (UID: \"1e817c67-7688-42d4-8a82-ce72282cbb51\") " pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.656183 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.656250 4553 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.656439 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1e817c67-7688-42d4-8a82-ce72282cbb51-rootfs\") pod \"machine-config-daemon-9n4dl\" (UID: \"1e817c67-7688-42d4-8a82-ce72282cbb51\") " pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.657170 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.657747 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.658097 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.659029 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.659200 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.659351 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.660281 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.663480 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.664068 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e817c67-7688-42d4-8a82-ce72282cbb51-proxy-tls\") pod \"machine-config-daemon-9n4dl\" (UID: \"1e817c67-7688-42d4-8a82-ce72282cbb51\") " pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.664164 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.664520 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.668129 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.664610 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.674314 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.664626 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.674351 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.674394 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.674408 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:47Z","lastTransitionTime":"2025-09-30T19:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.664728 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.665126 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.665197 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.665261 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.665384 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.685656 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhn66\" (UniqueName: \"kubernetes.io/projected/1e817c67-7688-42d4-8a82-ce72282cbb51-kube-api-access-lhn66\") pod \"machine-config-daemon-9n4dl\" (UID: \"1e817c67-7688-42d4-8a82-ce72282cbb51\") " 
pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.685817 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgqgj\" (UniqueName: \"kubernetes.io/projected/ff92820c-07f5-4503-8d99-5428f5fbecb8-kube-api-access-qgqgj\") pod \"node-resolver-p4qgs\" (UID: \"ff92820c-07f5-4503-8d99-5428f5fbecb8\") " pod="openshift-dns/node-resolver-p4qgs" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.694684 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.695365 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.696114 4553 scope.go:117] "RemoveContainer" containerID="a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.696387 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.712866 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.729712 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.744316 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.748729 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-log-socket\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.748850 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-cni-bin\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.748932 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-multus-cni-dir\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.749054 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-multus-conf-dir\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.749159 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-openvswitch\") pod \"ovnkube-node-fmsrf\" (UID: 
\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.749271 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-env-overrides\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.749363 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-os-release\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.749480 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.749587 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn7xc\" (UniqueName: \"kubernetes.io/projected/584c5bac-180e-46de-8e53-6586f27f2cea-kube-api-access-vn7xc\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.749681 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-run-netns\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.749770 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-systemd\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.749875 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-cni-netd\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.749983 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0d6b9396-3666-49a3-9d06-f764a3b39edf-cni-binary-copy\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.750120 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-hostroot\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.750204 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-cni-binary-copy\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.750288 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-var-lib-kubelet\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.750389 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-cnibin\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.751533 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.751896 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-node-log\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.751925 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-system-cni-dir\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.751953 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.751989 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-var-lib-cni-multus\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752003 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-run-multus-certs\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752050 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752072 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-multus-socket-dir-parent\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752131 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-system-cni-dir\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752163 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0d6b9396-3666-49a3-9d06-f764a3b39edf-multus-daemon-config\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752203 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-run-ovn-kubernetes\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752222 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-os-release\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752252 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-var-lib-cni-bin\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752270 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovnkube-config\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752289 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-run-k8s-cni-cncf-io\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752308 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-run-netns\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752326 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-kubelet\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752340 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-slash\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752369 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-ovn\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752396 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-etc-kubernetes\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752414 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-cnibin\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752433 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpc46\" (UniqueName: \"kubernetes.io/projected/0d6b9396-3666-49a3-9d06-f764a3b39edf-kube-api-access-tpc46\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752454 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovn-node-metrics-cert\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752471 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kltgr\" (UniqueName: \"kubernetes.io/projected/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-kube-api-access-kltgr\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752489 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovnkube-script-lib\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752508 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-dz6qz\" (UniqueName: \"kubernetes.io/projected/4457466e-c6fd-4a2f-8b73-c205c50f90e3-kube-api-access-dz6qz\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752528 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-systemd-units\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752547 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-var-lib-openvswitch\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752563 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-etc-openvswitch\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.752578 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 
19:32:47.760361 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.760892 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.770245 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.775488 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.778460 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p4qgs" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.787253 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.787700 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.787709 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.787747 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.787757 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:47Z","lastTransitionTime":"2025-09-30T19:32:47Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.787253 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.794977 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: W0930 19:32:47.795264 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-538bdc0aea7386472c33f9ad8bd5f1c5c193e7f9ea3a7e62f6edb6c65da626c7 WatchSource:0}: Error finding container 538bdc0aea7386472c33f9ad8bd5f1c5c193e7f9ea3a7e62f6edb6c65da626c7: Status 404 returned error can't find the container with id 538bdc0aea7386472c33f9ad8bd5f1c5c193e7f9ea3a7e62f6edb6c65da626c7 Sep 30 19:32:47 crc kubenswrapper[4553]: W0930 19:32:47.803897 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-856487ce070a09f396948005b74154467deb0021e4f399025118bd8f93a6cb85 WatchSource:0}: Error finding container 856487ce070a09f396948005b74154467deb0021e4f399025118bd8f93a6cb85: Status 404 returned error can't find the container with id 856487ce070a09f396948005b74154467deb0021e4f399025118bd8f93a6cb85 Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.808169 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: W0930 19:32:47.809060 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e817c67_7688_42d4_8a82_ce72282cbb51.slice/crio-7b36ae4fb8234c34c817d9167914b6b86f00f5b6f46ea71ec23cb00d4abf7f7e WatchSource:0}: Error finding container 7b36ae4fb8234c34c817d9167914b6b86f00f5b6f46ea71ec23cb00d4abf7f7e: Status 404 returned 
error can't find the container with id 7b36ae4fb8234c34c817d9167914b6b86f00f5b6f46ea71ec23cb00d4abf7f7e Sep 30 19:32:47 crc kubenswrapper[4553]: W0930 19:32:47.819360 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-ef8e23299c93e5bf54c334d230a0dcf5dc8c019d9ce63411df5498dcd64c7536 WatchSource:0}: Error finding container ef8e23299c93e5bf54c334d230a0dcf5dc8c019d9ce63411df5498dcd64c7536: Status 404 returned error can't find the container with id ef8e23299c93e5bf54c334d230a0dcf5dc8c019d9ce63411df5498dcd64c7536 Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.821060 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.831991 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.845526 4553 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853690 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz6qz\" (UniqueName: \"kubernetes.io/projected/4457466e-c6fd-4a2f-8b73-c205c50f90e3-kube-api-access-dz6qz\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853731 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kltgr\" (UniqueName: \"kubernetes.io/projected/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-kube-api-access-kltgr\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853753 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovnkube-script-lib\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853773 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-systemd-units\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853790 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-var-lib-openvswitch\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853805 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-etc-openvswitch\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853824 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853841 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-log-socket\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853857 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-cni-bin\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853897 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-multus-cni-dir\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853915 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-multus-conf-dir\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853950 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-openvswitch\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853964 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-env-overrides\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.853990 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-os-release\") pod \"multus-additional-cni-plugins-qwr6w\" 
(UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854010 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854024 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn7xc\" (UniqueName: \"kubernetes.io/projected/584c5bac-180e-46de-8e53-6586f27f2cea-kube-api-access-vn7xc\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854057 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-run-netns\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854073 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-systemd\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854088 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-cni-netd\") pod \"ovnkube-node-fmsrf\" (UID: 
\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854094 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-cni-bin\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854103 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0d6b9396-3666-49a3-9d06-f764a3b39edf-cni-binary-copy\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854157 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-cni-binary-copy\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854191 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-hostroot\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854595 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-var-lib-kubelet\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc 
kubenswrapper[4553]: I0930 19:32:47.854648 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-cnibin\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854669 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-node-log\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854678 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0d6b9396-3666-49a3-9d06-f764a3b39edf-cni-binary-copy\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854690 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-system-cni-dir\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854707 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854734 
4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-multus-socket-dir-parent\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854750 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-var-lib-cni-multus\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854770 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-run-multus-certs\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854786 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854812 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-system-cni-dir\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854850 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-multus-cni-dir\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854865 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-multus-conf-dir\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.854960 4553 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:32:47 crc kubenswrapper[4553]: E0930 19:32:47.855008 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs podName:584c5bac-180e-46de-8e53-6586f27f2cea nodeName:}" failed. No retries permitted until 2025-09-30 19:32:48.354995155 +0000 UTC m=+21.554497285 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs") pod "network-metrics-daemon-swqk9" (UID: "584c5bac-180e-46de-8e53-6586f27f2cea") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855199 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-openvswitch\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855375 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-run-netns\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855436 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-systemd\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855461 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-cni-netd\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855486 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-var-lib-kubelet\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855619 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-env-overrides\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855675 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-os-release\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855706 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855727 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-systemd-units\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855748 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-var-lib-openvswitch\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855769 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-etc-openvswitch\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855801 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-multus-socket-dir-parent\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855826 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-cnibin\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855848 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-node-log\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.855871 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-system-cni-dir\") pod 
\"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.856387 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.856426 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-run-multus-certs\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.856451 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-var-lib-cni-multus\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.856600 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-hostroot\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.856776 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-log-socket\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.856885 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-cni-binary-copy\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857370 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857597 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovnkube-script-lib\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.854828 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0d6b9396-3666-49a3-9d06-f764a3b39edf-multus-daemon-config\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857679 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-run-ovn-kubernetes\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857696 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-os-release\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857756 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-run-ovn-kubernetes\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857809 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-os-release\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857825 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0d6b9396-3666-49a3-9d06-f764a3b39edf-multus-daemon-config\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857712 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-var-lib-cni-bin\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857860 4553 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-var-lib-cni-bin\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857867 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovnkube-config\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857894 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-run-k8s-cni-cncf-io\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857910 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-run-netns\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857926 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-kubelet\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857942 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-slash\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857957 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-ovn\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857967 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-run-netns\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857971 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-etc-kubernetes\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.857988 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-etc-kubernetes\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.858002 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovn-node-metrics-cert\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.858020 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-cnibin\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.858050 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-slash\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.858063 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpc46\" (UniqueName: \"kubernetes.io/projected/0d6b9396-3666-49a3-9d06-f764a3b39edf-kube-api-access-tpc46\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.858091 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-kubelet\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.858129 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-cnibin\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.858100 4553 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-vzlwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.858161 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-host-run-k8s-cni-cncf-io\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.858024 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-ovn\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.858660 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovnkube-config\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.861810 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovn-node-metrics-cert\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.862512 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0d6b9396-3666-49a3-9d06-f764a3b39edf-system-cni-dir\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.870219 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz6qz\" (UniqueName: \"kubernetes.io/projected/4457466e-c6fd-4a2f-8b73-c205c50f90e3-kube-api-access-dz6qz\") pod \"ovnkube-node-fmsrf\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.872199 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.876820 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn7xc\" (UniqueName: \"kubernetes.io/projected/584c5bac-180e-46de-8e53-6586f27f2cea-kube-api-access-vn7xc\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.877247 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpc46\" (UniqueName: \"kubernetes.io/projected/0d6b9396-3666-49a3-9d06-f764a3b39edf-kube-api-access-tpc46\") pod \"multus-vzlwd\" (UID: \"0d6b9396-3666-49a3-9d06-f764a3b39edf\") " pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.880722 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kltgr\" (UniqueName: \"kubernetes.io/projected/8b7b8059-b38b-4faf-8a46-ad5a8489cf21-kube-api-access-kltgr\") pod \"multus-additional-cni-plugins-qwr6w\" (UID: \"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\") " pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.886722 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.895105 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.895155 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.895168 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.895191 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.895202 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:47Z","lastTransitionTime":"2025-09-30T19:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.902428 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.923248 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b85ac1b239ec48caf99e6b4cad1a9d38866019acf3348f1afa5f1bb361537560\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:41Z\\\",\\\"message\\\":\\\"W0930 19:32:30.578356 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 19:32:30.578806 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759260750 cert, and key in /tmp/serving-cert-2428047843/serving-signer.crt, /tmp/serving-cert-2428047843/serving-signer.key\\\\nI0930 19:32:30.862737 1 observer_polling.go:159] Starting file observer\\\\nW0930 19:32:30.865823 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 19:32:30.867896 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:30.869233 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2428047843/tls.crt::/tmp/serving-cert-2428047843/tls.key\\\\\\\"\\\\nF0930 19:32:41.158489 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 
19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.935489 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.954576 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.978991 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vzlwd" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.987081 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.995104 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.997789 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.997849 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.997883 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.997904 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:47 crc kubenswrapper[4553]: I0930 19:32:47.997917 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:47Z","lastTransitionTime":"2025-09-30T19:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.060008 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.060215 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:32:49.060175939 +0000 UTC m=+22.259678059 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.060291 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.060356 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.060464 4553 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.060539 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:49.060519019 +0000 UTC m=+22.260021339 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.060540 4553 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.060621 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:49.060601091 +0000 UTC m=+22.260103211 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.105419 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.105454 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.105463 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.105480 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.105496 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:48Z","lastTransitionTime":"2025-09-30T19:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.161534 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.161618 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.161857 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.161884 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.161897 4553 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.161980 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:49.161961444 +0000 UTC m=+22.361463574 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.162121 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.162136 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.162176 4553 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.162205 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:49.16219669 +0000 UTC m=+22.361698810 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.212415 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.212471 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.212483 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.212505 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.212519 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:48Z","lastTransitionTime":"2025-09-30T19:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.314880 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.315130 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.315223 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.315286 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.315367 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:48Z","lastTransitionTime":"2025-09-30T19:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.363202 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.363352 4553 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.363396 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs podName:584c5bac-180e-46de-8e53-6586f27f2cea nodeName:}" failed. No retries permitted until 2025-09-30 19:32:49.363383307 +0000 UTC m=+22.562885437 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs") pod "network-metrics-daemon-swqk9" (UID: "584c5bac-180e-46de-8e53-6586f27f2cea") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.417979 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.418070 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.418086 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.418110 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.418123 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:48Z","lastTransitionTime":"2025-09-30T19:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.521499 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.521549 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.521562 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.521586 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.521597 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:48Z","lastTransitionTime":"2025-09-30T19:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.625157 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.625194 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.625206 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.625224 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.625237 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:48Z","lastTransitionTime":"2025-09-30T19:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.627615 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vzlwd" event={"ID":"0d6b9396-3666-49a3-9d06-f764a3b39edf","Type":"ContainerStarted","Data":"f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.627649 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vzlwd" event={"ID":"0d6b9396-3666-49a3-9d06-f764a3b39edf","Type":"ContainerStarted","Data":"cfeedf940e140c9d913b86977363f6c8347aa851b65f3b7f90664fe2ec787fef"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.639937 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.642441 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b85ac1b239ec48caf99e6b4cad1a9d38866019acf3348f1afa5f1bb361537560\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:41Z\\\",\\\"message\\\":\\\"W0930 19:32:30.578356 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 19:32:30.578806 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759260750 cert, and key in /tmp/serving-cert-2428047843/serving-signer.crt, /tmp/serving-cert-2428047843/serving-signer.key\\\\nI0930 19:32:30.862737 1 observer_polling.go:159] Starting file observer\\\\nW0930 19:32:30.865823 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 19:32:30.867896 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:30.869233 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2428047843/tls.crt::/tmp/serving-cert-2428047843/tls.key\\\\\\\"\\\\nF0930 19:32:41.158489 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.643477 4553 scope.go:117] "RemoveContainer" containerID="a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082" Sep 30 19:32:48 crc kubenswrapper[4553]: E0930 19:32:48.643620 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.645614 4553 generic.go:334] "Generic (PLEG): container finished" podID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerID="e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54" exitCode=0 Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.645688 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.645723 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerStarted","Data":"12d3a586bfcc7f16a8463399ff40c0805db03877450618a1169429a3a8f70985"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 
19:32:48.646903 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p4qgs" event={"ID":"ff92820c-07f5-4503-8d99-5428f5fbecb8","Type":"ContainerStarted","Data":"9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.646927 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p4qgs" event={"ID":"ff92820c-07f5-4503-8d99-5428f5fbecb8","Type":"ContainerStarted","Data":"9a7f7e5bf209befefe3ceaf07bf2baa823eebb6cc2bff0a3cd0f8b5e968c99fa"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.650215 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerStarted","Data":"2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.650242 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerStarted","Data":"dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.650273 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerStarted","Data":"7b36ae4fb8234c34c817d9167914b6b86f00f5b6f46ea71ec23cb00d4abf7f7e"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.652331 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" event={"ID":"8b7b8059-b38b-4faf-8a46-ad5a8489cf21","Type":"ContainerStarted","Data":"242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.652624 4553 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" event={"ID":"8b7b8059-b38b-4faf-8a46-ad5a8489cf21","Type":"ContainerStarted","Data":"1e4beeaf212fcc4dfde4e0fb1b8a94510f2a7a6379ce77abba2aa80c4c894a5c"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.656136 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ef8e23299c93e5bf54c334d230a0dcf5dc8c019d9ce63411df5498dcd64c7536"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.657462 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.658050 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.658110 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"538bdc0aea7386472c33f9ad8bd5f1c5c193e7f9ea3a7e62f6edb6c65da626c7"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.660508 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.660539 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.660551 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"856487ce070a09f396948005b74154467deb0021e4f399025118bd8f93a6cb85"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.691492 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with incomplete 
status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\
":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"va
r-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/ru
n/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.707484 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.728434 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.730830 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.730890 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.731002 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.731024 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.731055 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:48Z","lastTransitionTime":"2025-09-30T19:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.753811 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.780220 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.800931 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:48 crc 
kubenswrapper[4553]: I0930 19:32:48.818687 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.835683 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.835733 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.835744 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.835764 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.835777 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:48Z","lastTransitionTime":"2025-09-30T19:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.857912 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.874242 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.889633 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.905952 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.938870 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:48 crc 
kubenswrapper[4553]: I0930 19:32:48.938927 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.938937 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.938958 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.938969 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:48Z","lastTransitionTime":"2025-09-30T19:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.942078 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:48 crc kubenswrapper[4553]: I0930 19:32:48.991815 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.011843 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.027288 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.040763 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.040790 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.040799 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.040812 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.040821 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:49Z","lastTransitionTime":"2025-09-30T19:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.043332 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc 
kubenswrapper[4553]: I0930 19:32:49.056752 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.068063 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.069339 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.069433 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.069539 4553 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.069539 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:32:51.069511782 +0000 UTC m=+24.269013912 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.069610 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.069744 4553 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.069761 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:51.069716198 +0000 UTC m=+24.269218328 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.069814 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:51.06980179 +0000 UTC m=+24.269304050 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.079017 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.090586 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.102482 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.116106 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.125593 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.143199 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.143242 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.143253 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.143301 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.143314 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:49Z","lastTransitionTime":"2025-09-30T19:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.143817 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.170756 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.170911 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.170880 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.170941 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.170954 4553 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.171014 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:51.17099513 +0000 UTC m=+24.370497350 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.171081 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.171103 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.171116 4553 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.171173 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:51.171158294 +0000 UTC m=+24.370660504 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.245839 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.245864 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.245872 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.245884 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.245893 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:49Z","lastTransitionTime":"2025-09-30T19:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.348097 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.348137 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.348147 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.348160 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.348170 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:49Z","lastTransitionTime":"2025-09-30T19:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.373418 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.373585 4553 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.373660 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs podName:584c5bac-180e-46de-8e53-6586f27f2cea nodeName:}" failed. No retries permitted until 2025-09-30 19:32:51.373641835 +0000 UTC m=+24.573143965 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs") pod "network-metrics-daemon-swqk9" (UID: "584c5bac-180e-46de-8e53-6586f27f2cea") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.450859 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.450890 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.450899 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.450915 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.450927 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:49Z","lastTransitionTime":"2025-09-30T19:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.503509 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.503565 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.503528 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.503509 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.503643 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.503729 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.503779 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:32:49 crc kubenswrapper[4553]: E0930 19:32:49.503925 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.507985 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.508698 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.511176 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.511809 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.514318 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.515761 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.516927 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.518311 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.519262 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.519991 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.521014 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.522243 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.523361 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.524002 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.525155 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.525883 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.527009 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.527789 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.554142 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.554188 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.554203 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.554224 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.554239 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:49Z","lastTransitionTime":"2025-09-30T19:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.657313 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.657802 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.657813 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.657831 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.657843 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:49Z","lastTransitionTime":"2025-09-30T19:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.666759 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerStarted","Data":"743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.666804 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerStarted","Data":"528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.666816 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerStarted","Data":"00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.666825 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerStarted","Data":"b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.666834 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerStarted","Data":"e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.668338 4553 generic.go:334] "Generic (PLEG): container finished" podID="8b7b8059-b38b-4faf-8a46-ad5a8489cf21" containerID="242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244" exitCode=0 Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.668379 4553 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" event={"ID":"8b7b8059-b38b-4faf-8a46-ad5a8489cf21","Type":"ContainerDied","Data":"242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.668404 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" event={"ID":"8b7b8059-b38b-4faf-8a46-ad5a8489cf21","Type":"ContainerStarted","Data":"046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.683023 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.683525 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.698523 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.699516 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.713807 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.736305 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.759087 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.762028 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.762081 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.762092 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.762109 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.762121 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:49Z","lastTransitionTime":"2025-09-30T19:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.768859 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.785657 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.797508 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://040
57d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.810231 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.819898 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.831381 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.840011 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc 
kubenswrapper[4553]: I0930 19:32:49.857483 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.863790 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.863809 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.863817 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.863830 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 
19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.863838 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:49Z","lastTransitionTime":"2025-09-30T19:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.871597 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.885821 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.899015 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.910871 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.923094 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.947455 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.961102 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.965758 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.965790 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.965801 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.965818 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.965831 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:49Z","lastTransitionTime":"2025-09-30T19:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.972933 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:49 crc kubenswrapper[4553]: I0930 19:32:49.989753 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:49Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.002553 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.014000 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.020456 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.024285 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.027116 4553 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.029647 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.042964 4553 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.043684 4553 scope.go:117] "RemoveContainer" containerID="a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082" Sep 30 19:32:50 crc kubenswrapper[4553]: E0930 19:32:50.043857 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 30 19:32:50 crc kubenswrapper[4553]: 
I0930 19:32:50.045168 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.057431 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc 
kubenswrapper[4553]: I0930 19:32:50.067387 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.067416 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.067427 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.067442 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.067454 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:50Z","lastTransitionTime":"2025-09-30T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.076516 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.089245 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.103315 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.115567 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.131496 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.143734 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.166108 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.169981 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.170025 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.170065 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.170087 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.170101 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:50Z","lastTransitionTime":"2025-09-30T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.181268 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.192485 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.220374 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.233314 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.246656 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.262121 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.276421 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.276475 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.276489 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.276508 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.276527 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:50Z","lastTransitionTime":"2025-09-30T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.280460 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.310148 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc 
kubenswrapper[4553]: I0930 19:32:50.351720 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.378789 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.378830 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.378842 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.378857 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 
19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.378868 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:50Z","lastTransitionTime":"2025-09-30T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.391362 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-46cs9"] Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.391673 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-46cs9" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.394373 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.400587 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.421102 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.440554 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.478814 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.481475 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.481522 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.481536 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.481551 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.481562 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:50Z","lastTransitionTime":"2025-09-30T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.484933 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6gc8\" (UniqueName: \"kubernetes.io/projected/1baa362f-a5ec-4459-9108-66da9e4195de-kube-api-access-z6gc8\") pod \"node-ca-46cs9\" (UID: \"1baa362f-a5ec-4459-9108-66da9e4195de\") " pod="openshift-image-registry/node-ca-46cs9" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.484968 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1baa362f-a5ec-4459-9108-66da9e4195de-host\") pod \"node-ca-46cs9\" (UID: \"1baa362f-a5ec-4459-9108-66da9e4195de\") " pod="openshift-image-registry/node-ca-46cs9" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.484986 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1baa362f-a5ec-4459-9108-66da9e4195de-serviceca\") pod \"node-ca-46cs9\" (UID: \"1baa362f-a5ec-4459-9108-66da9e4195de\") " pod="openshift-image-registry/node-ca-46cs9" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.509256 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.550822 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.583640 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.583672 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.583682 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.583695 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.583705 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:50Z","lastTransitionTime":"2025-09-30T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.586143 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6gc8\" (UniqueName: \"kubernetes.io/projected/1baa362f-a5ec-4459-9108-66da9e4195de-kube-api-access-z6gc8\") pod \"node-ca-46cs9\" (UID: \"1baa362f-a5ec-4459-9108-66da9e4195de\") " pod="openshift-image-registry/node-ca-46cs9" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.586254 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1baa362f-a5ec-4459-9108-66da9e4195de-host\") pod \"node-ca-46cs9\" (UID: \"1baa362f-a5ec-4459-9108-66da9e4195de\") " pod="openshift-image-registry/node-ca-46cs9" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.586341 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1baa362f-a5ec-4459-9108-66da9e4195de-host\") pod \"node-ca-46cs9\" (UID: \"1baa362f-a5ec-4459-9108-66da9e4195de\") " pod="openshift-image-registry/node-ca-46cs9" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.586394 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1baa362f-a5ec-4459-9108-66da9e4195de-serviceca\") pod \"node-ca-46cs9\" (UID: \"1baa362f-a5ec-4459-9108-66da9e4195de\") " pod="openshift-image-registry/node-ca-46cs9" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.587638 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/1baa362f-a5ec-4459-9108-66da9e4195de-serviceca\") pod \"node-ca-46cs9\" (UID: \"1baa362f-a5ec-4459-9108-66da9e4195de\") " pod="openshift-image-registry/node-ca-46cs9" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.589756 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.616856 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6gc8\" (UniqueName: \"kubernetes.io/projected/1baa362f-a5ec-4459-9108-66da9e4195de-kube-api-access-z6gc8\") pod \"node-ca-46cs9\" (UID: \"1baa362f-a5ec-4459-9108-66da9e4195de\") " pod="openshift-image-registry/node-ca-46cs9" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.651539 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.674124 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" 
event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerStarted","Data":"57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f"} Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.676263 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840"} Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.677955 4553 generic.go:334] "Generic (PLEG): container finished" podID="8b7b8059-b38b-4faf-8a46-ad5a8489cf21" containerID="046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66" exitCode=0 Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.678446 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" event={"ID":"8b7b8059-b38b-4faf-8a46-ad5a8489cf21","Type":"ContainerDied","Data":"046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66"} Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.685865 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.685981 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.686111 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.686184 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.686249 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:50Z","lastTransitionTime":"2025-09-30T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.705585 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.706119 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-46cs9" Sep 30 19:32:50 crc kubenswrapper[4553]: W0930 19:32:50.717342 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1baa362f_a5ec_4459_9108_66da9e4195de.slice/crio-6d24d1cb6ddd182e05b1d55896b85e5a0e8e3b928a0e30afc73c70db12d0b7c3 WatchSource:0}: Error finding container 6d24d1cb6ddd182e05b1d55896b85e5a0e8e3b928a0e30afc73c70db12d0b7c3: Status 404 returned error can't find the container with id 6d24d1cb6ddd182e05b1d55896b85e5a0e8e3b928a0e30afc73c70db12d0b7c3 Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.733443 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.772519 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.790153 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.790184 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.790192 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.790206 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.790215 4553 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:50Z","lastTransitionTime":"2025-09-30T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.816091 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.849697 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.890148 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.892618 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.892655 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.892665 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:50 crc 
kubenswrapper[4553]: I0930 19:32:50.892681 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.892691 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:50Z","lastTransitionTime":"2025-09-30T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.929931 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.968722 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:50Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.994870 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.994898 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.994906 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 
19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.994919 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:50 crc kubenswrapper[4553]: I0930 19:32:50.994928 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:50Z","lastTransitionTime":"2025-09-30T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.008880 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 
19:32:51.061429 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc 
kubenswrapper[4553]: I0930 19:32:51.089740 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.089863 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.089887 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.089941 4553 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.089946 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:32:55.089918076 +0000 UTC m=+28.289420206 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.089990 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:55.089975678 +0000 UTC m=+28.289477808 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.090025 4553 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.090086 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:55.09007504 +0000 UTC m=+28.289577160 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.094048 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.097556 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.097580 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.097588 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.097600 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 
19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.097609 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:51Z","lastTransitionTime":"2025-09-30T19:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.128061 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.167875 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.191207 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 
19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.191271 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.191399 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.191424 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.191439 4553 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.191440 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.191477 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.191492 4553 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.191493 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:55.191479116 +0000 UTC m=+28.390981256 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.191571 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 19:32:55.191551208 +0000 UTC m=+28.391053448 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.199133 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.199164 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.199183 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.199197 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.199206 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:51Z","lastTransitionTime":"2025-09-30T19:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.208244 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z 
is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.250147 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f
39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.289416 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.300933 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.300963 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.300973 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.300987 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.300997 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:51Z","lastTransitionTime":"2025-09-30T19:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.329507 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.382880 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.392652 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.392823 4553 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.392892 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs podName:584c5bac-180e-46de-8e53-6586f27f2cea nodeName:}" failed. 
No retries permitted until 2025-09-30 19:32:55.392876058 +0000 UTC m=+28.592378188 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs") pod "network-metrics-daemon-swqk9" (UID: "584c5bac-180e-46de-8e53-6586f27f2cea") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.403018 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.403070 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.403080 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.403092 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.403101 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:51Z","lastTransitionTime":"2025-09-30T19:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.417271 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.456699 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.494223 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.503101 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.503140 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.503109 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.503287 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.503311 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.503436 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.503542 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:32:51 crc kubenswrapper[4553]: E0930 19:32:51.503662 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.506942 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.506971 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.506983 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.506997 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.507010 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:51Z","lastTransitionTime":"2025-09-30T19:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.533163 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.569002 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.609106 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.609141 4553 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.609151 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.609167 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.609182 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:51Z","lastTransitionTime":"2025-09-30T19:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.610183 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.649097 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc 
kubenswrapper[4553]: I0930 19:32:51.683592 4553 generic.go:334] "Generic (PLEG): container finished" podID="8b7b8059-b38b-4faf-8a46-ad5a8489cf21" containerID="2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4" exitCode=0 Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.683623 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" event={"ID":"8b7b8059-b38b-4faf-8a46-ad5a8489cf21","Type":"ContainerDied","Data":"2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4"} Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.686238 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-46cs9" event={"ID":"1baa362f-a5ec-4459-9108-66da9e4195de","Type":"ContainerStarted","Data":"2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c"} Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.686266 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-46cs9" event={"ID":"1baa362f-a5ec-4459-9108-66da9e4195de","Type":"ContainerStarted","Data":"6d24d1cb6ddd182e05b1d55896b85e5a0e8e3b928a0e30afc73c70db12d0b7c3"} Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.700817 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.712149 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.712196 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.712209 4553 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.712228 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.712240 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:51Z","lastTransitionTime":"2025-09-30T19:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.732605 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" 
Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.769283 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.812914 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.814783 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:51 crc 
kubenswrapper[4553]: I0930 19:32:51.814822 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.814833 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.814849 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.814860 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:51Z","lastTransitionTime":"2025-09-30T19:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.852381 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.894836 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.917073 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.917110 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.917121 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.917172 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.917185 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:51Z","lastTransitionTime":"2025-09-30T19:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.931873 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:51 crc kubenswrapper[4553]: I0930 19:32:51.979184 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:51Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.020852 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.020909 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.020929 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.020954 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.020971 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:52Z","lastTransitionTime":"2025-09-30T19:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.030907 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.058971 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.091575 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.123492 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.123531 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.123542 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.123558 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.123580 4553 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:52Z","lastTransitionTime":"2025-09-30T19:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.140155 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.170219 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.212599 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.225669 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.225721 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.225736 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.225756 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.225771 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:52Z","lastTransitionTime":"2025-09-30T19:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.251390 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.328343 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:52 crc 
kubenswrapper[4553]: I0930 19:32:52.328391 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.327715 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 
19:32:52.328404 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.328563 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.328583 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:52Z","lastTransitionTime":"2025-09-30T19:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.344301 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.370928 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc 
kubenswrapper[4553]: I0930 19:32:52.430857 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.431157 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.431321 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.431427 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.431445 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:52Z","lastTransitionTime":"2025-09-30T19:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.533705 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.533745 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.533757 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.533771 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.533784 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:52Z","lastTransitionTime":"2025-09-30T19:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.635992 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.636049 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.636060 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.636078 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.636088 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:52Z","lastTransitionTime":"2025-09-30T19:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.692784 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerStarted","Data":"82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416"} Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.698892 4553 generic.go:334] "Generic (PLEG): container finished" podID="8b7b8059-b38b-4faf-8a46-ad5a8489cf21" containerID="0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc" exitCode=0 Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.698945 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" event={"ID":"8b7b8059-b38b-4faf-8a46-ad5a8489cf21","Type":"ContainerDied","Data":"0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc"} Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.711557 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.722328 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.732980 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.737816 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.737872 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.737885 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.737907 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.737919 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:52Z","lastTransitionTime":"2025-09-30T19:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.742426 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.753643 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.764328 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc 
kubenswrapper[4553]: I0930 19:32:52.777784 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.790564 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.802851 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.813888 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.827743 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.840121 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.840168 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.840184 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.840204 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.840219 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:52Z","lastTransitionTime":"2025-09-30T19:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.849238 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:
32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.901506 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.931501 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://040
57d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.943153 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.943205 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.943222 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.943244 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.943261 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:52Z","lastTransitionTime":"2025-09-30T19:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:52 crc kubenswrapper[4553]: I0930 19:32:52.969999 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:52Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.014424 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.045319 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.045359 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.045371 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.045386 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.045397 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:53Z","lastTransitionTime":"2025-09-30T19:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.147881 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.147928 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.147940 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.147958 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.147972 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:53Z","lastTransitionTime":"2025-09-30T19:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.250594 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.250629 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.250637 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.250650 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.250658 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:53Z","lastTransitionTime":"2025-09-30T19:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.352436 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.352481 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.352496 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.352512 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.352525 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:53Z","lastTransitionTime":"2025-09-30T19:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.454970 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.455005 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.455015 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.455030 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.455063 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:53Z","lastTransitionTime":"2025-09-30T19:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.503431 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.503458 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.503526 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:53 crc kubenswrapper[4553]: E0930 19:32:53.503631 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.503925 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:32:53 crc kubenswrapper[4553]: E0930 19:32:53.504008 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:32:53 crc kubenswrapper[4553]: E0930 19:32:53.504107 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:32:53 crc kubenswrapper[4553]: E0930 19:32:53.504174 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.557263 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.557295 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.557306 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.557321 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.557333 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:53Z","lastTransitionTime":"2025-09-30T19:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.659808 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.659850 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.659861 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.659877 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.659889 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:53Z","lastTransitionTime":"2025-09-30T19:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.704784 4553 generic.go:334] "Generic (PLEG): container finished" podID="8b7b8059-b38b-4faf-8a46-ad5a8489cf21" containerID="31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0" exitCode=0 Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.704831 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" event={"ID":"8b7b8059-b38b-4faf-8a46-ad5a8489cf21","Type":"ContainerDied","Data":"31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0"} Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.740302 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.760866 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.763279 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.763320 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.763330 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.763347 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:53 crc kubenswrapper[4553]: 
I0930 19:32:53.763358 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:53Z","lastTransitionTime":"2025-09-30T19:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.772641 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703
f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.794960 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.810863 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.822050 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.834859 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.845622 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.858401 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30
T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.866262 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.866288 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.866296 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.866309 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.866318 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:53Z","lastTransitionTime":"2025-09-30T19:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.873791 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc 
kubenswrapper[4553]: I0930 19:32:53.888640 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.900535 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.912885 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.924779 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.936102 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.947531 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:53Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.967758 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:53 crc 
kubenswrapper[4553]: I0930 19:32:53.967782 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.967791 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.967804 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:53 crc kubenswrapper[4553]: I0930 19:32:53.967812 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:53Z","lastTransitionTime":"2025-09-30T19:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.070452 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.070500 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.070515 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.070540 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.070557 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:54Z","lastTransitionTime":"2025-09-30T19:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.173355 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.173394 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.173406 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.173422 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.173433 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:54Z","lastTransitionTime":"2025-09-30T19:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.275779 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.276077 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.276087 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.276106 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.276115 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:54Z","lastTransitionTime":"2025-09-30T19:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.378176 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.378211 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.378220 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.378235 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.378243 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:54Z","lastTransitionTime":"2025-09-30T19:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.480862 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.480935 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.480952 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.480976 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.480993 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:54Z","lastTransitionTime":"2025-09-30T19:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.583883 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.583923 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.583935 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.583955 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.583964 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:54Z","lastTransitionTime":"2025-09-30T19:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.686652 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.686717 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.686733 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.686759 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.686777 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:54Z","lastTransitionTime":"2025-09-30T19:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.716791 4553 generic.go:334] "Generic (PLEG): container finished" podID="8b7b8059-b38b-4faf-8a46-ad5a8489cf21" containerID="c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9" exitCode=0 Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.716871 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" event={"ID":"8b7b8059-b38b-4faf-8a46-ad5a8489cf21","Type":"ContainerDied","Data":"c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9"} Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.745168 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.756786 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.777506 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.789871 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.789897 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.789928 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.789943 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.789952 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:54Z","lastTransitionTime":"2025-09-30T19:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.791367 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.810910 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.825793 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.838267 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc 
kubenswrapper[4553]: I0930 19:32:54.853160 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.869005 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.883115 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.894502 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.894675 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.894758 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.894851 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.894938 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:54Z","lastTransitionTime":"2025-09-30T19:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.900189 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.911559 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.930778 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.948172 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.963724 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.973718 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:54Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.997972 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.998005 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.998019 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.998119 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:54 crc kubenswrapper[4553]: I0930 19:32:54.998137 4553 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:54Z","lastTransitionTime":"2025-09-30T19:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.100455 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.100488 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.100496 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.100509 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.100520 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:55Z","lastTransitionTime":"2025-09-30T19:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.150456 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.150652 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.150693 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.150804 4553 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.150871 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:33:03.150850247 +0000 UTC m=+36.350352387 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.150955 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:33:03.15094457 +0000 UTC m=+36.350446710 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.151533 4553 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.151586 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:33:03.151572066 +0000 UTC m=+36.351074206 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.202565 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.202601 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.202613 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.202629 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.202640 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:55Z","lastTransitionTime":"2025-09-30T19:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.252105 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.252200 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.252346 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.252390 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.252406 4553 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.252455 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 19:33:03.252440527 +0000 UTC m=+36.451942647 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.252350 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.252509 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.252525 4553 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.252584 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 19:33:03.25256792 +0000 UTC m=+36.452070060 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.305600 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.305635 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.305643 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.305657 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.305666 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:55Z","lastTransitionTime":"2025-09-30T19:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.407578 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.407632 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.407644 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.407660 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.407672 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:55Z","lastTransitionTime":"2025-09-30T19:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.453838 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.454005 4553 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.454077 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs podName:584c5bac-180e-46de-8e53-6586f27f2cea nodeName:}" failed. No retries permitted until 2025-09-30 19:33:03.454061425 +0000 UTC m=+36.653563555 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs") pod "network-metrics-daemon-swqk9" (UID: "584c5bac-180e-46de-8e53-6586f27f2cea") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.503492 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.503502 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.503591 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.504196 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.503697 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.504277 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.503995 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:32:55 crc kubenswrapper[4553]: E0930 19:32:55.504337 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.510497 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.510544 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.510560 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.510581 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.510600 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:55Z","lastTransitionTime":"2025-09-30T19:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.613527 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.613593 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.613608 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.613633 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.613657 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:55Z","lastTransitionTime":"2025-09-30T19:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.716255 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.716299 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.716310 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.716327 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.716339 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:55Z","lastTransitionTime":"2025-09-30T19:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.730889 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerStarted","Data":"1e7d6fbd92da4993fcea298795f8d34dd544278cb4a1ed25f1ca6c063dc79392"} Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.732391 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.732479 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.737952 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" event={"ID":"8b7b8059-b38b-4faf-8a46-ad5a8489cf21","Type":"ContainerStarted","Data":"c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb"} Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.746939 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.762583 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.767371 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.767798 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.777413 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.794406 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.812628 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.818279 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:55 crc 
kubenswrapper[4553]: I0930 19:32:55.818418 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.818565 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.818680 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.818805 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:55Z","lastTransitionTime":"2025-09-30T19:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.835293 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.849722 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.864506 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.887147 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7d6fbd92da4993fcea298795f8d34dd544278cb4a1ed25f1ca6c063dc79392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.906685 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.922455 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.922510 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.922529 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.922554 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.922571 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:55Z","lastTransitionTime":"2025-09-30T19:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.928844 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.942919 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.953788 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.967477 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30
T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc kubenswrapper[4553]: I0930 19:32:55.978309 4553 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:55 crc 
kubenswrapper[4553]: I0930 19:32:55.992277 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:55Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.002944 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.014902 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.025978 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.026186 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.026258 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.026331 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.026414 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:56Z","lastTransitionTime":"2025-09-30T19:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.027273 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.039646 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.053795 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc 
kubenswrapper[4553]: I0930 19:32:56.069519 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.089395 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
nv\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.120694 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.128259 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.128313 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.128327 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.128346 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.128358 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:56Z","lastTransitionTime":"2025-09-30T19:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.139579 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.167656 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.179594 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.190213 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.199354 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.214617 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7d6fbd92da4993fcea298795f8d34dd544278cb4a1ed25f1ca6c063dc79392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],
\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.230329 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.230503 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.230570 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.230637 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.230697 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:56Z","lastTransitionTime":"2025-09-30T19:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.231645 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.242704 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:56Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.332179 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.332389 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.332453 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.332512 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.332568 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:56Z","lastTransitionTime":"2025-09-30T19:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.434886 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.434915 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.434924 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.434936 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.434945 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:56Z","lastTransitionTime":"2025-09-30T19:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.536919 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.536957 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.536966 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.536977 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.536985 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:56Z","lastTransitionTime":"2025-09-30T19:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.639166 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.639220 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.639228 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.639240 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.639249 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:56Z","lastTransitionTime":"2025-09-30T19:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.740261 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.740597 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.740641 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.740658 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.740677 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.740691 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:56Z","lastTransitionTime":"2025-09-30T19:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.843199 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.843223 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.843231 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.843243 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.843252 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:56Z","lastTransitionTime":"2025-09-30T19:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.945258 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.945446 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.945523 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.945580 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:56 crc kubenswrapper[4553]: I0930 19:32:56.945631 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:56Z","lastTransitionTime":"2025-09-30T19:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.048811 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.049222 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.049417 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.049585 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.049742 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.151798 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.152103 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.152352 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.152501 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.152646 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.244344 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.244592 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.244781 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.244928 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.245081 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: E0930 19:32:57.256839 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.260131 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.260304 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.260471 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.260545 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.260631 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: E0930 19:32:57.280689 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.284139 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.284162 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.284171 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.284183 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.284192 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.300138 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.300162 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.300170 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.300183 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.300191 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.319904 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.320145 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.320264 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.320603 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.320705 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: E0930 19:32:57.334928 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: E0930 19:32:57.335333 4553 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.336596 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.336728 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.336861 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.336967 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.337092 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.439355 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.439422 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.439432 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.439444 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.439453 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.503233 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.503290 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:57 crc kubenswrapper[4553]: E0930 19:32:57.503347 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.503589 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:57 crc kubenswrapper[4553]: E0930 19:32:57.503652 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:32:57 crc kubenswrapper[4553]: E0930 19:32:57.503721 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.503757 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:32:57 crc kubenswrapper[4553]: E0930 19:32:57.503808 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.516515 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.531481 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.541530 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.541742 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.541825 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.542014 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.542186 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.546894 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.559001 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.587771 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.611173 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.630658 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.640729 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.644530 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.644550 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.644558 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.644570 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.644579 4553 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.657798 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7d6fbd92da4993fcea298795f8d34dd544278cb4a1ed25f1ca6c063dc79392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.668831 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.681721 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.696110 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.706590 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.718262 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30
T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.727671 4553 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc 
kubenswrapper[4553]: I0930 19:32:57.740788 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.747112 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.747144 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.747156 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.747172 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.747192 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.749405 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/0.log" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.751800 4553 generic.go:334] "Generic (PLEG): container finished" podID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerID="1e7d6fbd92da4993fcea298795f8d34dd544278cb4a1ed25f1ca6c063dc79392" exitCode=1 Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.751836 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"1e7d6fbd92da4993fcea298795f8d34dd544278cb4a1ed25f1ca6c063dc79392"} Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.752474 4553 scope.go:117] "RemoveContainer" containerID="1e7d6fbd92da4993fcea298795f8d34dd544278cb4a1ed25f1ca6c063dc79392" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.765524 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.777660 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc 
kubenswrapper[4553]: I0930 19:32:57.792827 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.806963 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.822614 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.835144 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.848646 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.850368 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.850406 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.850420 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.850437 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.850448 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.864527 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:
32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.884987 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.900407 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://040
57d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.912990 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.935148 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7d6fbd92da4993fcea298795f8d34dd544278cb4a1ed25f1ca6c063dc79392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e7d6fbd92da4993fcea298795f8d34dd544278cb4a1ed25f1ca6c063dc79392\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"r removal\\\\nI0930 19:32:57.684530 5772 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 19:32:57.684538 5772 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 19:32:57.684539 5772 
handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 19:32:57.684539 5772 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 19:32:57.684554 5772 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 19:32:57.684572 5772 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 19:32:57.684611 5772 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 19:32:57.684596 5772 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 19:32:57.684630 5772 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 19:32:57.684643 5772 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 19:32:57.684647 5772 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 19:32:57.684654 5772 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 19:32:57.684688 5772 factory.go:656] Stopping watch factory\\\\nI0930 19:32:57.684701 5772 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 19:32:57.684708 5772 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 19:32:57.684716 5772 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.948741 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.952341 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.952381 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.952396 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 
19:32:57.952416 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.952433 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:57Z","lastTransitionTime":"2025-09-30T19:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.965427 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.978531 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:57 crc kubenswrapper[4553]: I0930 19:32:57.990171 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:57Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.057917 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.057948 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.057956 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.057972 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.057981 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:58Z","lastTransitionTime":"2025-09-30T19:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.160486 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.160523 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.160533 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.160547 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.160559 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:58Z","lastTransitionTime":"2025-09-30T19:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.262787 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.262832 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.262845 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.262861 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.262873 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:58Z","lastTransitionTime":"2025-09-30T19:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.364539 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.364583 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.364598 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.364615 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.364626 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:58Z","lastTransitionTime":"2025-09-30T19:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.467261 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.467326 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.467343 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.467370 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.467392 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:58Z","lastTransitionTime":"2025-09-30T19:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.570538 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.570584 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.570603 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.570623 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.570638 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:58Z","lastTransitionTime":"2025-09-30T19:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.673169 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.673217 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.673232 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.673252 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.673266 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:58Z","lastTransitionTime":"2025-09-30T19:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.758158 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/1.log" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.760088 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/0.log" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.763523 4553 generic.go:334] "Generic (PLEG): container finished" podID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerID="4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6" exitCode=1 Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.763562 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6"} Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.763593 4553 scope.go:117] "RemoveContainer" containerID="1e7d6fbd92da4993fcea298795f8d34dd544278cb4a1ed25f1ca6c063dc79392" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.768368 4553 scope.go:117] "RemoveContainer" containerID="4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6" Sep 30 19:32:58 crc kubenswrapper[4553]: E0930 19:32:58.768662 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.776662 4553 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.776704 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.776721 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.776744 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.776763 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:58Z","lastTransitionTime":"2025-09-30T19:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.791121 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:58Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.808167 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:58Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.827949 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:58Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.851772 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:58Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.872198 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:58Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.879816 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.880163 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.880655 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.880887 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.881104 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:58Z","lastTransitionTime":"2025-09-30T19:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.886553 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:58Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.917297 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e7d6fbd92da4993fcea298795f8d34dd544278cb4a1ed25f1ca6c063dc79392\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:32:57Z\\\",\\\"message\\\":\\\"r removal\\\\nI0930 19:32:57.684530 5772 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 19:32:57.684538 5772 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 19:32:57.684539 5772 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 19:32:57.684539 5772 handler.go:208] Removed *v1.NetworkPolicy 
event handler 4\\\\nI0930 19:32:57.684554 5772 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 19:32:57.684572 5772 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 19:32:57.684611 5772 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 19:32:57.684596 5772 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 19:32:57.684630 5772 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0930 19:32:57.684643 5772 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 19:32:57.684647 5772 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0930 19:32:57.684654 5772 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 19:32:57.684688 5772 factory.go:656] Stopping watch factory\\\\nI0930 19:32:57.684701 5772 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 19:32:57.684708 5772 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 19:32:57.684716 5772 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:32:58Z\\\",\\\"message\\\":\\\"86854 5894 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0930 19:32:58.485583 5894 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486879 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486885 5894 ovn.go:134] 
Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 19:32:58.486901 5894 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0930 19:32:58.485665 5894 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.487095 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.485947 5894 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/ne
tworks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:58Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.949443 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:58Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.965084 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:58Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.985022 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:58Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.985075 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.985280 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.985294 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.985313 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:58 crc kubenswrapper[4553]: I0930 19:32:58.985325 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:58Z","lastTransitionTime":"2025-09-30T19:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.000392 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:58Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.013559 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.025267 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.036069 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.049669 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.079509 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.088090 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.088301 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.088368 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.088442 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.088502 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:59Z","lastTransitionTime":"2025-09-30T19:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.189964 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.190062 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.190082 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.190103 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.190118 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:59Z","lastTransitionTime":"2025-09-30T19:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.292943 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.292971 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.292979 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.292991 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.292999 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:59Z","lastTransitionTime":"2025-09-30T19:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.395353 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.395403 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.395421 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.395450 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.395470 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:59Z","lastTransitionTime":"2025-09-30T19:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.497569 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.497628 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.497645 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.497671 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.497688 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:59Z","lastTransitionTime":"2025-09-30T19:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.503940 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.503982 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.504118 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:32:59 crc kubenswrapper[4553]: E0930 19:32:59.504238 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.504161 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:32:59 crc kubenswrapper[4553]: E0930 19:32:59.504118 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:32:59 crc kubenswrapper[4553]: E0930 19:32:59.504484 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:32:59 crc kubenswrapper[4553]: E0930 19:32:59.504814 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.600011 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.600072 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.600084 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.600099 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.600110 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:59Z","lastTransitionTime":"2025-09-30T19:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.703127 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.703581 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.703778 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.703980 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.704183 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:59Z","lastTransitionTime":"2025-09-30T19:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.771248 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/1.log" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.777853 4553 scope.go:117] "RemoveContainer" containerID="4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6" Sep 30 19:32:59 crc kubenswrapper[4553]: E0930 19:32:59.778196 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.798856 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.807970 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.808112 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.808182 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:59 crc 
kubenswrapper[4553]: I0930 19:32:59.808213 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.808239 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:59Z","lastTransitionTime":"2025-09-30T19:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.821506 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.835244 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.850639 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.863862 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.884076 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.899185 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.911072 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.911117 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.911128 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.911100 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.911147 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.911311 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:32:59Z","lastTransitionTime":"2025-09-30T19:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.924305 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.942573 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.956726 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.973741 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:32:59 crc kubenswrapper[4553]: I0930 19:32:59.987789 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:32:59Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.014520 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.014612 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.014636 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.014661 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.014679 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:00Z","lastTransitionTime":"2025-09-30T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.024178 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:32:58Z\\\",\\\"message\\\":\\\"86854 5894 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0930 19:32:58.485583 5894 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486879 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486885 5894 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 19:32:58.486901 5894 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0930 19:32:58.485665 5894 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.487095 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.485947 5894 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:00Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.049180 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:00Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.064562 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:00Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.117666 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.117758 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.117779 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.117802 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.117820 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:00Z","lastTransitionTime":"2025-09-30T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.220851 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.220884 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.220895 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.220910 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.220921 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:00Z","lastTransitionTime":"2025-09-30T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.322897 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.322940 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.322950 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.322965 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.322976 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:00Z","lastTransitionTime":"2025-09-30T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.426399 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.426465 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.426482 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.426508 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.426528 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:00Z","lastTransitionTime":"2025-09-30T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.529451 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.529751 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.529883 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.530009 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.530233 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:00Z","lastTransitionTime":"2025-09-30T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.633372 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.633448 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.633472 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.633504 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.633526 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:00Z","lastTransitionTime":"2025-09-30T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.735937 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.735976 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.735993 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.736020 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.736082 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:00Z","lastTransitionTime":"2025-09-30T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.838734 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.838777 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.838792 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.838812 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.838825 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:00Z","lastTransitionTime":"2025-09-30T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.941247 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.941326 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.941341 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.941394 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:00 crc kubenswrapper[4553]: I0930 19:33:00.941409 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:00Z","lastTransitionTime":"2025-09-30T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.044101 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.044190 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.044218 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.044250 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.044274 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:01Z","lastTransitionTime":"2025-09-30T19:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.146592 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.146647 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.146664 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.146689 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.146705 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:01Z","lastTransitionTime":"2025-09-30T19:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.249565 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.249825 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.249941 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.250134 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.250279 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:01Z","lastTransitionTime":"2025-09-30T19:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.253672 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp"] Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.254342 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.256301 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.257401 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.269405 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.282465 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.297059 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.309279 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.314421 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/884f4a42-261c-4547-95da-20ba542ce60b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5szqp\" (UID: \"884f4a42-261c-4547-95da-20ba542ce60b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.314638 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/884f4a42-261c-4547-95da-20ba542ce60b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5szqp\" (UID: \"884f4a42-261c-4547-95da-20ba542ce60b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.314859 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/884f4a42-261c-4547-95da-20ba542ce60b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5szqp\" (UID: \"884f4a42-261c-4547-95da-20ba542ce60b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.315029 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggkbl\" (UniqueName: \"kubernetes.io/projected/884f4a42-261c-4547-95da-20ba542ce60b-kube-api-access-ggkbl\") pod \"ovnkube-control-plane-749d76644c-5szqp\" (UID: \"884f4a42-261c-4547-95da-20ba542ce60b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.325933 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"n
ame\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.342312 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc 
kubenswrapper[4553]: I0930 19:33:01.353773 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.354030 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.354196 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.354322 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.354457 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:01Z","lastTransitionTime":"2025-09-30T19:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.359719 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.374476 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.389954 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.403407 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.415318 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.415519 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/884f4a42-261c-4547-95da-20ba542ce60b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5szqp\" (UID: \"884f4a42-261c-4547-95da-20ba542ce60b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.415570 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggkbl\" (UniqueName: \"kubernetes.io/projected/884f4a42-261c-4547-95da-20ba542ce60b-kube-api-access-ggkbl\") pod \"ovnkube-control-plane-749d76644c-5szqp\" (UID: \"884f4a42-261c-4547-95da-20ba542ce60b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.415591 4553 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/884f4a42-261c-4547-95da-20ba542ce60b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5szqp\" (UID: \"884f4a42-261c-4547-95da-20ba542ce60b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.415613 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/884f4a42-261c-4547-95da-20ba542ce60b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5szqp\" (UID: \"884f4a42-261c-4547-95da-20ba542ce60b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.417984 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/884f4a42-261c-4547-95da-20ba542ce60b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5szqp\" (UID: \"884f4a42-261c-4547-95da-20ba542ce60b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.418491 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/884f4a42-261c-4547-95da-20ba542ce60b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5szqp\" (UID: \"884f4a42-261c-4547-95da-20ba542ce60b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.430698 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/884f4a42-261c-4547-95da-20ba542ce60b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5szqp\" (UID: 
\"884f4a42-261c-4547-95da-20ba542ce60b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.438124 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\
\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.441779 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggkbl\" (UniqueName: \"kubernetes.io/projected/884f4a42-261c-4547-95da-20ba542ce60b-kube-api-access-ggkbl\") pod \"ovnkube-control-plane-749d76644c-5szqp\" (UID: \"884f4a42-261c-4547-95da-20ba542ce60b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.450387 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.456779 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.456825 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.456838 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.456858 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.456874 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:01Z","lastTransitionTime":"2025-09-30T19:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.468118 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.483489 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.496103 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.503635 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.503640 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.503760 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:01 crc kubenswrapper[4553]: E0930 19:33:01.503904 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.503933 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:01 crc kubenswrapper[4553]: E0930 19:33:01.504106 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:01 crc kubenswrapper[4553]: E0930 19:33:01.504279 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:01 crc kubenswrapper[4553]: E0930 19:33:01.504284 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.520720 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:32:58Z\\\",\\\"message\\\":\\\"86854 5894 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0930 19:32:58.485583 5894 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486879 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486885 5894 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 19:32:58.486901 5894 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0930 19:32:58.485665 5894 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.487095 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.485947 5894 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:01Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.559843 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.559890 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.559905 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.559925 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.559939 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:01Z","lastTransitionTime":"2025-09-30T19:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.567070 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" Sep 30 19:33:01 crc kubenswrapper[4553]: W0930 19:33:01.585598 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod884f4a42_261c_4547_95da_20ba542ce60b.slice/crio-d5421cb7c254a6ce1499bd57f7138b2fe6e547e5e22817fc2b856f9b521d2e5e WatchSource:0}: Error finding container d5421cb7c254a6ce1499bd57f7138b2fe6e547e5e22817fc2b856f9b521d2e5e: Status 404 returned error can't find the container with id d5421cb7c254a6ce1499bd57f7138b2fe6e547e5e22817fc2b856f9b521d2e5e Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.662250 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.662288 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.662301 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.662320 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.662334 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:01Z","lastTransitionTime":"2025-09-30T19:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.764270 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.764303 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.764314 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.764332 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.764343 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:01Z","lastTransitionTime":"2025-09-30T19:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.785646 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" event={"ID":"884f4a42-261c-4547-95da-20ba542ce60b","Type":"ContainerStarted","Data":"d5421cb7c254a6ce1499bd57f7138b2fe6e547e5e22817fc2b856f9b521d2e5e"} Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.866850 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.866928 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.866949 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.866972 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.866989 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:01Z","lastTransitionTime":"2025-09-30T19:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.969115 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.969165 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.969177 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.969199 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:01 crc kubenswrapper[4553]: I0930 19:33:01.969214 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:01Z","lastTransitionTime":"2025-09-30T19:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.072000 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.072064 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.072073 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.072087 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.072096 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:02Z","lastTransitionTime":"2025-09-30T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.174050 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.174080 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.174088 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.174101 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.174117 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:02Z","lastTransitionTime":"2025-09-30T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.276838 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.276885 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.276895 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.276912 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.276924 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:02Z","lastTransitionTime":"2025-09-30T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.379812 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.379843 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.379852 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.379867 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.379877 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:02Z","lastTransitionTime":"2025-09-30T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.482096 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.482140 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.482152 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.482170 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.482182 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:02Z","lastTransitionTime":"2025-09-30T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.584655 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.584698 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.584710 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.584733 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.584745 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:02Z","lastTransitionTime":"2025-09-30T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.687747 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.687788 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.687804 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.687826 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.687839 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:02Z","lastTransitionTime":"2025-09-30T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.789235 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.789506 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.789574 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.789655 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.789742 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:02Z","lastTransitionTime":"2025-09-30T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.789802 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" event={"ID":"884f4a42-261c-4547-95da-20ba542ce60b","Type":"ContainerStarted","Data":"9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd"} Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.789986 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" event={"ID":"884f4a42-261c-4547-95da-20ba542ce60b","Type":"ContainerStarted","Data":"3119591bff0fd9bdaaf6e2987f29223dab8dd6403fbce24eabb30e610ecaedcd"} Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.801969 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 
2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.812022 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc 
kubenswrapper[4553]: I0930 19:33:02.832944 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.844992 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95
b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\
"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.860128 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d
793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.871501 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.885601 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.892276 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.892316 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.892324 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.892353 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.892362 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:02Z","lastTransitionTime":"2025-09-30T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.895722 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.905054 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.925299 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.937895 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.949091 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.966234 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:32:58Z\\\",\\\"message\\\":\\\"86854 5894 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0930 19:32:58.485583 5894 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486879 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486885 5894 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 19:32:58.486901 5894 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0930 19:32:58.485665 5894 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.487095 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.485947 5894 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.974212 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.984237 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:02Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.995000 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.995077 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.995089 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.995107 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:02 crc kubenswrapper[4553]: I0930 19:33:02.995119 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:02Z","lastTransitionTime":"2025-09-30T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.011837 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.038672 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.097648 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.097698 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.097709 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.097725 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.097736 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:03Z","lastTransitionTime":"2025-09-30T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.153216 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.153328 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.153385 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:33:19.153367191 +0000 UTC m=+52.352869321 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.153427 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.153490 4553 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.153530 4553 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.153568 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:33:19.153555586 +0000 UTC m=+52.353057716 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.153593 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:33:19.153586777 +0000 UTC m=+52.353088907 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.200664 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.200711 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.200724 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.200741 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.200755 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:03Z","lastTransitionTime":"2025-09-30T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.254329 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.254396 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.254536 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.254575 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.254587 4553 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.254541 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.254627 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.254634 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 19:33:19.254621132 +0000 UTC m=+52.454123262 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.254639 4553 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.254710 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 
nodeName:}" failed. No retries permitted until 2025-09-30 19:33:19.254692364 +0000 UTC m=+52.454194544 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.302376 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.302413 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.302421 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.302435 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.302446 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:03Z","lastTransitionTime":"2025-09-30T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.404383 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.404414 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.404424 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.404436 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.404445 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:03Z","lastTransitionTime":"2025-09-30T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.456241 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.456451 4553 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.456586 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs podName:584c5bac-180e-46de-8e53-6586f27f2cea nodeName:}" failed. No retries permitted until 2025-09-30 19:33:19.456544089 +0000 UTC m=+52.656046239 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs") pod "network-metrics-daemon-swqk9" (UID: "584c5bac-180e-46de-8e53-6586f27f2cea") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.503318 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.503401 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.503329 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.503340 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.503470 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.503599 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.503665 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:03 crc kubenswrapper[4553]: E0930 19:33:03.504108 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.504485 4553 scope.go:117] "RemoveContainer" containerID="a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.506431 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.506464 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.506475 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.506490 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.506502 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:03Z","lastTransitionTime":"2025-09-30T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.608945 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.608981 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.608992 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.609008 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.609023 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:03Z","lastTransitionTime":"2025-09-30T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.711401 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.711438 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.711450 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.711468 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.711480 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:03Z","lastTransitionTime":"2025-09-30T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.794940 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.796552 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454"} Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.797118 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.813281 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.813316 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.813324 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.813337 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.813352 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:03Z","lastTransitionTime":"2025-09-30T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.814884 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.827544 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.838053 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.854891 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:32:58Z\\\",\\\"message\\\":\\\"86854 5894 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0930 19:32:58.485583 5894 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486879 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486885 5894 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 19:32:58.486901 5894 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0930 19:32:58.485665 5894 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.487095 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.485947 5894 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.865530 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.877210 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.887174 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.903469 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.915526 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.915576 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.915588 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.915605 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.915617 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:03Z","lastTransitionTime":"2025-09-30T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.917558 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.928466 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc 
kubenswrapper[4553]: I0930 19:33:03.940111 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.952590 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95
b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\
"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.963603 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d
793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.973767 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.984088 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:03 crc kubenswrapper[4553]: I0930 19:33:03.997343 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:03Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.017213 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.017260 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.017268 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.017281 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.017293 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:04Z","lastTransitionTime":"2025-09-30T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.045687 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:04Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.119919 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.119963 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.119975 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.119992 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.120005 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:04Z","lastTransitionTime":"2025-09-30T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.221838 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.221877 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.221885 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.221899 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.221908 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:04Z","lastTransitionTime":"2025-09-30T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.323929 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.323969 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.323977 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.323991 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.324001 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:04Z","lastTransitionTime":"2025-09-30T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.426375 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.426418 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.426429 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.426447 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.426462 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:04Z","lastTransitionTime":"2025-09-30T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.528654 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.528797 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.528808 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.528822 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.528831 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:04Z","lastTransitionTime":"2025-09-30T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.631497 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.631553 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.631565 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.631580 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.631590 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:04Z","lastTransitionTime":"2025-09-30T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.733620 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.733645 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.733654 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.733666 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.733674 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:04Z","lastTransitionTime":"2025-09-30T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.836289 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.836321 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.836329 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.836343 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.836351 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:04Z","lastTransitionTime":"2025-09-30T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.938920 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.938985 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.939008 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.939069 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:04 crc kubenswrapper[4553]: I0930 19:33:04.939087 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:04Z","lastTransitionTime":"2025-09-30T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.041375 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.041475 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.041501 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.041530 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.041553 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:05Z","lastTransitionTime":"2025-09-30T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.144113 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.144145 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.144170 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.144183 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.144192 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:05Z","lastTransitionTime":"2025-09-30T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.246534 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.246578 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.246592 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.246611 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.246626 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:05Z","lastTransitionTime":"2025-09-30T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.348757 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.348794 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.348805 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.348820 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.348831 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:05Z","lastTransitionTime":"2025-09-30T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.450949 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.451064 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.451080 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.451099 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.451113 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:05Z","lastTransitionTime":"2025-09-30T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.503794 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.503871 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.503892 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.504055 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:05 crc kubenswrapper[4553]: E0930 19:33:05.504032 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:05 crc kubenswrapper[4553]: E0930 19:33:05.504319 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:05 crc kubenswrapper[4553]: E0930 19:33:05.504328 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:05 crc kubenswrapper[4553]: E0930 19:33:05.504388 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.553594 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.553642 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.553657 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.553676 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.553690 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:05Z","lastTransitionTime":"2025-09-30T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.656926 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.656992 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.657016 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.657090 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.657120 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:05Z","lastTransitionTime":"2025-09-30T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.758848 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.758983 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.759011 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.759025 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.759034 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:05Z","lastTransitionTime":"2025-09-30T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.861514 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.861557 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.861571 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.861590 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.861605 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:05Z","lastTransitionTime":"2025-09-30T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.963955 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.964000 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.964011 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.964028 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:05 crc kubenswrapper[4553]: I0930 19:33:05.964058 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:05Z","lastTransitionTime":"2025-09-30T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.066394 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.066467 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.066484 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.067014 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.067331 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:06Z","lastTransitionTime":"2025-09-30T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.170864 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.170964 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.170992 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.171029 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.171110 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:06Z","lastTransitionTime":"2025-09-30T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.274853 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.274933 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.274982 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.275016 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.275075 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:06Z","lastTransitionTime":"2025-09-30T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.379376 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.379425 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.379442 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.379466 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.379482 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:06Z","lastTransitionTime":"2025-09-30T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.483132 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.483160 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.483170 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.483184 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.483197 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:06Z","lastTransitionTime":"2025-09-30T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.585961 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.586006 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.586022 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.586084 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.586102 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:06Z","lastTransitionTime":"2025-09-30T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.688802 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.688834 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.688842 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.688856 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.688864 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:06Z","lastTransitionTime":"2025-09-30T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.791431 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.791492 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.791512 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.791540 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.791560 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:06Z","lastTransitionTime":"2025-09-30T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.894610 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.894831 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.894848 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.894869 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.894885 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:06Z","lastTransitionTime":"2025-09-30T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.997717 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.997763 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.997780 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.997802 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:06 crc kubenswrapper[4553]: I0930 19:33:06.997819 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:06Z","lastTransitionTime":"2025-09-30T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.100944 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.101003 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.101020 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.101077 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.101106 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.203339 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.203393 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.203410 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.203432 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.203448 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.308161 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.308235 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.308258 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.308288 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.308311 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.411442 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.411510 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.411535 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.411562 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.411582 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.503786 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:07 crc kubenswrapper[4553]: E0930 19:33:07.503938 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.503985 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.504023 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:07 crc kubenswrapper[4553]: E0930 19:33:07.504104 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.504270 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:07 crc kubenswrapper[4553]: E0930 19:33:07.504281 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:07 crc kubenswrapper[4553]: E0930 19:33:07.504383 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.513648 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.513679 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.513698 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.513716 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.513728 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.523428 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.523481 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.523493 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.523512 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.523525 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.533485 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: E0930 19:33:07.542602 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.546592 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.546631 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.546642 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.546658 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.546672 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.549470 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z 
is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.570964 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: E0930 19:33:07.571625 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.576761 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.576797 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.576812 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.576833 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.576847 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: E0930 19:33:07.594681 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.598161 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa
53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.599149 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.599197 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.599210 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc 
kubenswrapper[4553]: I0930 19:33:07.599228 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.599242 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: E0930 19:33:07.611917 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient 
PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",
\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":
[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6f
b6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\
"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.614251 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd
8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.616382 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.616442 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.616454 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.616471 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.616482 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.628203 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: E0930 19:33:07.633625 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: E0930 19:33:07.633835 4553 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.635505 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.635558 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.635572 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.635593 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.635610 4553 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.657689 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:32:58Z\\\",\\\"message\\\":\\\"86854 5894 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0930 19:32:58.485583 5894 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486879 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486885 5894 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 19:32:58.486901 5894 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0930 19:32:58.485665 5894 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.487095 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.485947 5894 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.682686 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.700943 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.713405 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.731317 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.738265 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.738326 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.738342 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.738360 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.738374 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.746568 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.760833 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.791583 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.813319 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b550
8b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.835285 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.841353 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.841382 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.841393 4553 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.841409 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.841419 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.847758 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:07Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.944702 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.944744 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.944756 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.944773 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:07 crc kubenswrapper[4553]: I0930 19:33:07.944784 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:07Z","lastTransitionTime":"2025-09-30T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.047712 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.047757 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.047765 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.047780 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.047790 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:08Z","lastTransitionTime":"2025-09-30T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.151446 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.151499 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.151511 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.151527 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.151538 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:08Z","lastTransitionTime":"2025-09-30T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.255131 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.255184 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.255206 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.255233 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.255253 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:08Z","lastTransitionTime":"2025-09-30T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.358903 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.359078 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.359109 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.359138 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.359173 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:08Z","lastTransitionTime":"2025-09-30T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.461257 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.461316 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.461340 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.461369 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.461391 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:08Z","lastTransitionTime":"2025-09-30T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.563930 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.563962 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.563971 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.563984 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.563993 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:08Z","lastTransitionTime":"2025-09-30T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.666750 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.666794 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.666806 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.666824 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.666838 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:08Z","lastTransitionTime":"2025-09-30T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.768759 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.768789 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.768797 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.768809 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.768817 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:08Z","lastTransitionTime":"2025-09-30T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.870456 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.870518 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.870540 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.870567 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.870588 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:08Z","lastTransitionTime":"2025-09-30T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.972868 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.972930 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.972953 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.972979 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:08 crc kubenswrapper[4553]: I0930 19:33:08.973004 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:08Z","lastTransitionTime":"2025-09-30T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.075174 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.075207 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.075218 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.075231 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.075239 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:09Z","lastTransitionTime":"2025-09-30T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.179649 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.179682 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.179690 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.179703 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.179712 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:09Z","lastTransitionTime":"2025-09-30T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.281817 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.281853 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.281864 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.281879 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.281890 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:09Z","lastTransitionTime":"2025-09-30T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.384509 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.384539 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.384549 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.384563 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.384573 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:09Z","lastTransitionTime":"2025-09-30T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.486845 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.486889 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.486898 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.486913 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.486923 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:09Z","lastTransitionTime":"2025-09-30T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.503346 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.503384 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.503450 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.503460 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:09 crc kubenswrapper[4553]: E0930 19:33:09.503615 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:09 crc kubenswrapper[4553]: E0930 19:33:09.504333 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:09 crc kubenswrapper[4553]: E0930 19:33:09.504376 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:09 crc kubenswrapper[4553]: E0930 19:33:09.504446 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.589428 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.589466 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.589477 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.589491 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.589500 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:09Z","lastTransitionTime":"2025-09-30T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.692391 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.692447 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.692462 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.692488 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.692507 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:09Z","lastTransitionTime":"2025-09-30T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.794668 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.794701 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.794710 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.794724 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.794733 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:09Z","lastTransitionTime":"2025-09-30T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.897254 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.897291 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.897298 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.897311 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.897320 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:09Z","lastTransitionTime":"2025-09-30T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.999115 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.999462 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:09 crc kubenswrapper[4553]: I0930 19:33:09.999757 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.000208 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.000632 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:10Z","lastTransitionTime":"2025-09-30T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.103175 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.104131 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.104373 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.104658 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.104962 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:10Z","lastTransitionTime":"2025-09-30T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.207866 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.207907 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.207916 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.207928 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.207938 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:10Z","lastTransitionTime":"2025-09-30T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.311586 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.311638 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.311655 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.311678 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.311694 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:10Z","lastTransitionTime":"2025-09-30T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.414091 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.414141 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.414153 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.414175 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.414188 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:10Z","lastTransitionTime":"2025-09-30T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.517028 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.517112 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.517128 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.517150 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.517168 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:10Z","lastTransitionTime":"2025-09-30T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.619832 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.619886 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.619903 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.619925 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.619942 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:10Z","lastTransitionTime":"2025-09-30T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.723123 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.723167 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.723177 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.723197 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.723209 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:10Z","lastTransitionTime":"2025-09-30T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.825408 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.825447 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.825456 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.825477 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.825507 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:10Z","lastTransitionTime":"2025-09-30T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.928905 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.928988 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.929012 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.929087 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:10 crc kubenswrapper[4553]: I0930 19:33:10.929110 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:10Z","lastTransitionTime":"2025-09-30T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.031518 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.031568 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.031580 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.031600 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.031613 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:11Z","lastTransitionTime":"2025-09-30T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.135257 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.135309 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.135327 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.135351 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.135370 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:11Z","lastTransitionTime":"2025-09-30T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.238267 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.238368 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.238388 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.238413 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.238430 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:11Z","lastTransitionTime":"2025-09-30T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.341550 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.341636 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.341665 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.341700 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.341725 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:11Z","lastTransitionTime":"2025-09-30T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.445231 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.445291 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.445308 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.445335 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.445351 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:11Z","lastTransitionTime":"2025-09-30T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.503372 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.503394 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.503412 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:11 crc kubenswrapper[4553]: E0930 19:33:11.503494 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.503532 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:11 crc kubenswrapper[4553]: E0930 19:33:11.503638 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:11 crc kubenswrapper[4553]: E0930 19:33:11.503853 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:11 crc kubenswrapper[4553]: E0930 19:33:11.503980 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.547636 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.547708 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.547728 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.547757 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.547776 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:11Z","lastTransitionTime":"2025-09-30T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.650376 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.650444 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.650463 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.650487 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.650507 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:11Z","lastTransitionTime":"2025-09-30T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.753646 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.753964 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.754189 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.754442 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.754664 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:11Z","lastTransitionTime":"2025-09-30T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.858128 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.858192 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.858210 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.858234 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.858251 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:11Z","lastTransitionTime":"2025-09-30T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.961599 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.961658 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.961681 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.961713 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:11 crc kubenswrapper[4553]: I0930 19:33:11.961739 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:11Z","lastTransitionTime":"2025-09-30T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.064723 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.064775 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.064794 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.064819 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.064836 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:12Z","lastTransitionTime":"2025-09-30T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.167464 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.167514 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.167531 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.167602 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.167623 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:12Z","lastTransitionTime":"2025-09-30T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.271029 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.271119 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.271136 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.271200 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.271218 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:12Z","lastTransitionTime":"2025-09-30T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.374437 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.374511 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.374534 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.374564 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.374586 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:12Z","lastTransitionTime":"2025-09-30T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.478326 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.478381 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.478398 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.478423 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.478439 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:12Z","lastTransitionTime":"2025-09-30T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.505539 4553 scope.go:117] "RemoveContainer" containerID="4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.581806 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.582105 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.582117 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.582134 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.582148 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:12Z","lastTransitionTime":"2025-09-30T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.684993 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.685027 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.685253 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.685291 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.685304 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:12Z","lastTransitionTime":"2025-09-30T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.788076 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.788112 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.788123 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.788143 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.788155 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:12Z","lastTransitionTime":"2025-09-30T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.841285 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/1.log" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.845513 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerStarted","Data":"e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60"} Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.845618 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.872670 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba0297
0603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:12Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.887404 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:12Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.890204 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.890250 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.890263 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.890279 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.890289 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:12Z","lastTransitionTime":"2025-09-30T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.913695 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:12Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:12 crc 
kubenswrapper[4553]: I0930 19:33:12.934204 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:12Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.960706 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:12Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.984384 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bf
f0fd9bdaaf6e2987f29223dab8dd6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:12Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.996835 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:12Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.998023 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.998058 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.998066 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.998081 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:12 crc kubenswrapper[4553]: I0930 19:33:12.998090 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:12Z","lastTransitionTime":"2025-09-30T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.009476 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.026441 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"ipta
bles-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.046750 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:32:58Z\\\",\\\"message\\\":\\\"86854 5894 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0930 19:32:58.485583 5894 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486879 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486885 5894 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 19:32:58.486901 5894 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0930 19:32:58.485665 5894 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.487095 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.485947 5894 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.064540 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.077294 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.086760 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.098443 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.099922 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.099955 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.099968 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.099986 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.099997 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:13Z","lastTransitionTime":"2025-09-30T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.108537 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.121952 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.132226 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.202523 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.202553 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.202574 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:13 crc 
kubenswrapper[4553]: I0930 19:33:13.202587 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.202595 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:13Z","lastTransitionTime":"2025-09-30T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.304969 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.305025 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.305083 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.305127 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.305150 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:13Z","lastTransitionTime":"2025-09-30T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.408493 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.408543 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.408559 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.408581 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.408599 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:13Z","lastTransitionTime":"2025-09-30T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.503141 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.503192 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.503301 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.503533 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:13 crc kubenswrapper[4553]: E0930 19:33:13.503537 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:13 crc kubenswrapper[4553]: E0930 19:33:13.503713 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:13 crc kubenswrapper[4553]: E0930 19:33:13.503836 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:13 crc kubenswrapper[4553]: E0930 19:33:13.504007 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.511125 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.511167 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.511184 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.511237 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.511256 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:13Z","lastTransitionTime":"2025-09-30T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.614361 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.614448 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.614473 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.614510 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.614529 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:13Z","lastTransitionTime":"2025-09-30T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.717632 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.717675 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.717684 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.717698 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.717706 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:13Z","lastTransitionTime":"2025-09-30T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.820423 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.820468 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.820481 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.820498 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.820510 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:13Z","lastTransitionTime":"2025-09-30T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.852992 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/2.log" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.854172 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/1.log" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.859565 4553 generic.go:334] "Generic (PLEG): container finished" podID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerID="e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60" exitCode=1 Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.859625 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60"} Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.859675 4553 scope.go:117] "RemoveContainer" containerID="4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.862003 4553 scope.go:117] "RemoveContainer" containerID="e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60" Sep 30 19:33:13 crc kubenswrapper[4553]: E0930 19:33:13.862575 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.882853 4553 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.897270 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.911376 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.922254 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.922286 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.922296 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.922313 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.922327 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:13Z","lastTransitionTime":"2025-09-30T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.924867 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.938612 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.950403 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc 
kubenswrapper[4553]: I0930 19:33:13.965583 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.980337 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:13 crc kubenswrapper[4553]: I0930 19:33:13.994662 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.018679 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:14Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.024765 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.024799 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.024809 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.024865 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.024882 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:14Z","lastTransitionTime":"2025-09-30T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.041774 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:
32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:14Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.063608 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-
proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:14Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.083864 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92eda
f5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:14Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.108816 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:14Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.127942 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.128016 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.128070 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.128111 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.128133 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:14Z","lastTransitionTime":"2025-09-30T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.128583 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:14Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.154214 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:32:58Z\\\",\\\"message\\\":\\\"86854 5894 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0930 19:32:58.485583 5894 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486879 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486885 5894 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 19:32:58.486901 5894 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0930 19:32:58.485665 5894 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.487095 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.485947 5894 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:13Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z]\\\\nI0930 19:33:13.358489 6123 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} 
options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"
},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:14Z is after 2025-08-24T17:21:41Z" 
Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.180665 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:14Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.230459 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.230495 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.230503 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.230517 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.230524 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:14Z","lastTransitionTime":"2025-09-30T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.333758 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.333797 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.333807 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.333823 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.333832 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:14Z","lastTransitionTime":"2025-09-30T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.437401 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.437452 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.437466 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.437488 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.437502 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:14Z","lastTransitionTime":"2025-09-30T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.540296 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.540348 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.540365 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.540389 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.540407 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:14Z","lastTransitionTime":"2025-09-30T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.643484 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.643531 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.643541 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.643557 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.643571 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:14Z","lastTransitionTime":"2025-09-30T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.746227 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.746258 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.746268 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.746281 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.746291 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:14Z","lastTransitionTime":"2025-09-30T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.849017 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.849084 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.849094 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.849109 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.849118 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:14Z","lastTransitionTime":"2025-09-30T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.864108 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/2.log" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.952557 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.952587 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.952597 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.952612 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:14 crc kubenswrapper[4553]: I0930 19:33:14.952621 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:14Z","lastTransitionTime":"2025-09-30T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.055372 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.055728 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.055753 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.055777 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.055793 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:15Z","lastTransitionTime":"2025-09-30T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.149799 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.157240 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.157275 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.157286 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.157300 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.157312 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:15Z","lastTransitionTime":"2025-09-30T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.163394 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.173584 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6
163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\
":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.189169 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" 
Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.200018 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.200031 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc 
kubenswrapper[4553]: I0930 19:33:15.200929 4553 scope.go:117] "RemoveContainer" containerID="e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60" Sep 30 19:33:15 crc kubenswrapper[4553]: E0930 19:33:15.201131 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.212686 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.225334 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.236734 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bf
f0fd9bdaaf6e2987f29223dab8dd6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.251064 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.259464 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.259505 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.259516 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.259533 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.259544 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:15Z","lastTransitionTime":"2025-09-30T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.268177 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.282770 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"ipta
bles-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.308165 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d0fdf0804b20b130ad6ee0e3f499bba5ca618d06c9af5652729e22f0ec433e6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:32:58Z\\\",\\\"message\\\":\\\"86854 5894 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI0930 19:32:58.485583 5894 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486879 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 19:32:58.486885 5894 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 19:32:58.486901 5894 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI0930 19:32:58.485665 5894 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.487095 5894 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-vzlwd\\\\nI0930 19:32:58.485947 5894 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:13Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 2025-08-24T17:21:41Z]\\\\nI0930 19:33:13.358489 6123 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} 
options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"
},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" 
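The status-patch failures above all reduce to one condition: the `network-node-identity` webhook's serving certificate expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-09-30. A minimal sketch of how to confirm that kind of expiry with `openssl` — the demo cert, paths, and CN below are hypothetical, generated locally purely to illustrate the check; against the live node you would instead inspect the cert served at the endpoint the log names (`echo | openssl s_client -connect 127.0.0.1:9743 | openssl x509 -noout -dates`):

```shell
# Generate a throwaway 1-day self-signed cert so the check is reproducible
# without cluster access (all names/paths here are illustrative only).
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=webhook-demo" \
    -keyout /tmp/demo-key.pem -out /tmp/demo-cert.pem -days 1 2>/dev/null

# Print the validity window -- the notAfter line is what the kubelet error
# compares against the current time.
openssl x509 -noout -dates -in /tmp/demo-cert.pem

# -checkend N exits non-zero if the cert expires within N seconds; checking
# 172800s (2 days) against this 1-day cert trips the same "expired" condition
# the log reports for the webhook cert.
openssl x509 -checkend 172800 -in /tmp/demo-cert.pem \
    || echo "certificate has expired (or will within 2 days)"
```

Once an expired serving cert is confirmed, the usual remediation on CRC-style clusters is to let the cluster's cert rotation re-issue it (which generally requires the node clock to be correct first).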
Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.348737 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.361306 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.361335 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.361343 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.361357 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.361389 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:15Z","lastTransitionTime":"2025-09-30T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.367691 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.382229 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.399218 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.414366 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.431259 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.443727 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.463436 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.464412 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.464964 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.465183 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.465341 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.465535 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:15Z","lastTransitionTime":"2025-09-30T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.478951 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.497829 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:13Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 
2025-08-24T17:21:41Z]\\\\nI0930 19:33:13.358489 6123 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.503643 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:15 crc kubenswrapper[4553]: E0930 19:33:15.503784 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.503975 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:15 crc kubenswrapper[4553]: E0930 19:33:15.504116 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.504535 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:15 crc kubenswrapper[4553]: E0930 19:33:15.504629 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.504703 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:15 crc kubenswrapper[4553]: E0930 19:33:15.504776 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.528124 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.541977 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.553253 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.564763 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.569072 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.569122 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.569135 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.569152 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.569164 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:15Z","lastTransitionTime":"2025-09-30T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.577487 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.589151 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf65d04-1873-4650-868c-076118fd4dd5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae915f74f875c4b4aae052da19bcce9693322ed42573211a3fd458761891b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fb42a36284069544112bc5523c6c89b0d0cae4b3cfd7bb292a05691e1de01cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://196c8c016a4602b9d6bda11b4c30276c3536485dc8c75529c0c7059816768d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.604279 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.614888 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc 
kubenswrapper[4553]: I0930 19:33:15.638935 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.658578 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.672023 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.672099 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.672111 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.672129 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.672140 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:15Z","lastTransitionTime":"2025-09-30T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.672839 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.686781 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.704922 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.717142 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bf
f0fd9bdaaf6e2987f29223dab8dd6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.729730 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:15Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.774600 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.774664 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.774684 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.774711 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.774727 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:15Z","lastTransitionTime":"2025-09-30T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.877611 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.877649 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.877659 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.877673 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.877682 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:15Z","lastTransitionTime":"2025-09-30T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.979901 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.979941 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.979952 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.979969 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:15 crc kubenswrapper[4553]: I0930 19:33:15.979981 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:15Z","lastTransitionTime":"2025-09-30T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.082538 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.082591 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.082609 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.082633 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.082649 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:16Z","lastTransitionTime":"2025-09-30T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.185865 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.185920 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.185960 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.185977 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.185990 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:16Z","lastTransitionTime":"2025-09-30T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.288754 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.288820 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.288844 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.288875 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.288898 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:16Z","lastTransitionTime":"2025-09-30T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.391818 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.391875 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.391891 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.391917 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.391934 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:16Z","lastTransitionTime":"2025-09-30T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.494685 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.494746 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.494763 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.494786 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.494804 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:16Z","lastTransitionTime":"2025-09-30T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.597204 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.597255 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.597271 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.597293 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.597312 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:16Z","lastTransitionTime":"2025-09-30T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.699941 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.699979 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.699988 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.700002 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.700011 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:16Z","lastTransitionTime":"2025-09-30T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.802505 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.802549 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.802560 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.802577 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.802589 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:16Z","lastTransitionTime":"2025-09-30T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.914450 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.914489 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.914501 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.914517 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:16 crc kubenswrapper[4553]: I0930 19:33:16.914529 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:16Z","lastTransitionTime":"2025-09-30T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.017487 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.017593 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.017611 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.017639 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.017658 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:17Z","lastTransitionTime":"2025-09-30T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.120004 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.120104 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.120123 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.120147 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.120167 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:17Z","lastTransitionTime":"2025-09-30T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.223319 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.223372 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.223390 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.223413 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.223432 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:17Z","lastTransitionTime":"2025-09-30T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.326249 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.326312 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.326330 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.326353 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.326371 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:17Z","lastTransitionTime":"2025-09-30T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.429594 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.429677 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.429698 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.429723 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.429745 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:17Z","lastTransitionTime":"2025-09-30T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.503423 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.503456 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:17 crc kubenswrapper[4553]: E0930 19:33:17.503749 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.505212 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.505257 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:17 crc kubenswrapper[4553]: E0930 19:33:17.505355 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:17 crc kubenswrapper[4553]: E0930 19:33:17.505421 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:17 crc kubenswrapper[4553]: E0930 19:33:17.505489 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.527879 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776
590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.533571 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.533640 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.533661 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.533689 4553 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.533807 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:17Z","lastTransitionTime":"2025-09-30T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.548115 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc 
kubenswrapper[4553]: I0930 19:33:17.572314 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.594082 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318
bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.615383 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.632448 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.636461 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.636500 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.636510 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.636530 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.636542 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:17Z","lastTransitionTime":"2025-09-30T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.648071 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.662581 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.675258 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.699676 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.715842 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.727664 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.738976 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.739005 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.739015 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.739030 4553 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.739060 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:17Z","lastTransitionTime":"2025-09-30T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.756492 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:13Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 
2025-08-24T17:21:41Z]\\\\nI0930 19:33:13.358489 6123 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.770966 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf65d04-1873-4650-868c-076118fd4dd5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae915f74f875c4b4aae052da19bcce9693322ed42573211a3fd458761891b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fb42a36284069544112bc5523c6c89b0d0cae4b3cfd7bb292a05691e1de01cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://196c8c016a4602b9d6bda11b4c30276c3536485dc8c75529c0c7059816768d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.778767 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.787619 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.805139 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.818868 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.830419 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.841298 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.841654 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.841668 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.841699 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.841711 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:17Z","lastTransitionTime":"2025-09-30T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.850429 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.864251 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.878282 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.889340 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.902622 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf65d04-1873-4650-868c-076118fd4dd5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae915f74f875c4b4aae052da19bcce9693322ed42573211a3fd458761891b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fb42a36284069544112bc5523c6c89b0d0cae4b3cfd7bb292a05691e1de01cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://196c8c016a4602b9d6bda11b4c30276c3536485dc8c75529c0c7059816768d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.913813 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.925152 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc 
kubenswrapper[4553]: I0930 19:33:17.940857 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.944857 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.944908 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.944930 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.944956 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.944976 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:17Z","lastTransitionTime":"2025-09-30T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.955952 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.967663 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.980745 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.994304 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.994357 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.994372 4553 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.994390 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.994402 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:17Z","lastTransitionTime":"2025-09-30T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:17 crc kubenswrapper[4553]: I0930 19:33:17.994935 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf
19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:17Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:18 crc kubenswrapper[4553]: E0930 19:33:18.011133 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:18Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.014474 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd
6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:18Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.015619 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.015648 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.015659 4553 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.015676 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.015688 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:18Z","lastTransitionTime":"2025-09-30T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:18 crc kubenswrapper[4553]: E0930 19:33:18.028863 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:18Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.032485 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.032516 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.032527 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.032542 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.032555 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:18Z","lastTransitionTime":"2025-09-30T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.033344 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:18Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:18 crc kubenswrapper[4553]: E0930 19:33:18.046090 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:18Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.049196 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.049226 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.049237 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.049252 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.049265 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:18Z","lastTransitionTime":"2025-09-30T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.052195 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:18Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:18 crc kubenswrapper[4553]: E0930 19:33:18.061886 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:18Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.064727 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.064764 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.064774 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.064791 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.064805 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:18Z","lastTransitionTime":"2025-09-30T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.066816 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:18Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:18 crc kubenswrapper[4553]: E0930 19:33:18.078745 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:18Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:18 crc kubenswrapper[4553]: E0930 19:33:18.078865 4553 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.080332 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.080375 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.080389 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.080408 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.080420 4553 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:18Z","lastTransitionTime":"2025-09-30T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.090277 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:13Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 
2025-08-24T17:21:41Z]\\\\nI0930 19:33:13.358489 6123 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:18Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.112070 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:18Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.183979 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.184026 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.184075 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.184099 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.184117 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:18Z","lastTransitionTime":"2025-09-30T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.286962 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.287013 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.287030 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.287094 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.287112 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:18Z","lastTransitionTime":"2025-09-30T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.388732 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.388782 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.388791 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.388803 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.388813 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:18Z","lastTransitionTime":"2025-09-30T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.492028 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.492124 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.492142 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.492169 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.492194 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:18Z","lastTransitionTime":"2025-09-30T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.595052 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.595099 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.595107 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.595119 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.595129 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:18Z","lastTransitionTime":"2025-09-30T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.697793 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.697861 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.697887 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.697920 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.697946 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:18Z","lastTransitionTime":"2025-09-30T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.801543 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.801613 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.801630 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.801659 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.801679 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:18Z","lastTransitionTime":"2025-09-30T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.909537 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.909617 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.909642 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.909671 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:18 crc kubenswrapper[4553]: I0930 19:33:18.909695 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:18Z","lastTransitionTime":"2025-09-30T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.013195 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.013276 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.013291 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.013317 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.013334 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:19Z","lastTransitionTime":"2025-09-30T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.116314 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.116377 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.116398 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.116422 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.116437 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:19Z","lastTransitionTime":"2025-09-30T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.218399 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.218485 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 19:33:51.218463296 +0000 UTC m=+84.417965426 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.218828 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.218881 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.218935 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.218951 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.218975 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.218994 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:19Z","lastTransitionTime":"2025-09-30T19:33:19Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.218911 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.219270 4553 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.219378 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:33:51.219365321 +0000 UTC m=+84.418867451 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.219497 4553 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.219573 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:33:51.219566096 +0000 UTC m=+84.419068226 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.320687 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.320745 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.320868 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.320919 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.320930 4553 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.320972 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 19:33:51.32095838 +0000 UTC m=+84.520460500 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.320950 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.321003 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.321019 4553 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.321124 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 19:33:51.321101774 +0000 UTC m=+84.520603994 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.321779 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.321871 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.321915 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.321937 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.321950 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:19Z","lastTransitionTime":"2025-09-30T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.425362 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.425436 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.425455 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.425478 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.425496 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:19Z","lastTransitionTime":"2025-09-30T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.503169 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.503200 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.503265 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.503298 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.503376 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.503459 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.503522 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.503565 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.522446 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.522641 4553 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:33:19 crc kubenswrapper[4553]: E0930 19:33:19.522721 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs podName:584c5bac-180e-46de-8e53-6586f27f2cea nodeName:}" failed. No retries permitted until 2025-09-30 19:33:51.522701391 +0000 UTC m=+84.722203521 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs") pod "network-metrics-daemon-swqk9" (UID: "584c5bac-180e-46de-8e53-6586f27f2cea") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.528000 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.528092 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.528114 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.528141 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.528163 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:19Z","lastTransitionTime":"2025-09-30T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.630511 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.630572 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.630584 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.630605 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.630619 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:19Z","lastTransitionTime":"2025-09-30T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.733283 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.733349 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.733361 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.733379 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.733392 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:19Z","lastTransitionTime":"2025-09-30T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.837117 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.837195 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.837213 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.837236 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.837251 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:19Z","lastTransitionTime":"2025-09-30T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.941356 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.941438 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.941459 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.941490 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:19 crc kubenswrapper[4553]: I0930 19:33:19.941511 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:19Z","lastTransitionTime":"2025-09-30T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.044460 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.044511 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.044526 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.044549 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.044567 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:20Z","lastTransitionTime":"2025-09-30T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.147788 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.147834 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.147847 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.147865 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.147877 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:20Z","lastTransitionTime":"2025-09-30T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.251913 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.251982 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.252096 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.252138 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.252161 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:20Z","lastTransitionTime":"2025-09-30T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.355750 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.355800 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.355813 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.355831 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.355841 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:20Z","lastTransitionTime":"2025-09-30T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.458442 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.459092 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.459173 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.459259 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.459343 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:20Z","lastTransitionTime":"2025-09-30T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.562151 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.562629 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.562702 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.562779 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.562858 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:20Z","lastTransitionTime":"2025-09-30T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.666973 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.667344 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.667485 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.667601 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.667716 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:20Z","lastTransitionTime":"2025-09-30T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.771202 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.771571 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.771648 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.771725 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.771827 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:20Z","lastTransitionTime":"2025-09-30T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.876715 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.877030 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.877177 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.877350 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.877485 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:20Z","lastTransitionTime":"2025-09-30T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.980766 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.980837 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.980859 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.980894 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:20 crc kubenswrapper[4553]: I0930 19:33:20.980919 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:20Z","lastTransitionTime":"2025-09-30T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.084703 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.084788 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.084812 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.084839 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.084858 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:21Z","lastTransitionTime":"2025-09-30T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.187455 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.187762 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.187835 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.187898 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.187999 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:21Z","lastTransitionTime":"2025-09-30T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.290692 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.290725 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.290735 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.290749 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.290761 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:21Z","lastTransitionTime":"2025-09-30T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.392990 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.393272 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.393340 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.393403 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.393467 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:21Z","lastTransitionTime":"2025-09-30T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.496710 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.496804 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.496832 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.496862 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.496887 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:21Z","lastTransitionTime":"2025-09-30T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.503821 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.503910 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:21 crc kubenswrapper[4553]: E0930 19:33:21.504198 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.504249 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:21 crc kubenswrapper[4553]: E0930 19:33:21.504560 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:21 crc kubenswrapper[4553]: E0930 19:33:21.504614 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.504734 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:21 crc kubenswrapper[4553]: E0930 19:33:21.504939 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.599980 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.600026 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.600059 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.600072 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.600080 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:21Z","lastTransitionTime":"2025-09-30T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.702262 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.702625 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.702709 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.702801 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.702881 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:21Z","lastTransitionTime":"2025-09-30T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.805469 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.805517 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.805528 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.805545 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.805557 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:21Z","lastTransitionTime":"2025-09-30T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.907351 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.907401 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.907411 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.907425 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:21 crc kubenswrapper[4553]: I0930 19:33:21.907436 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:21Z","lastTransitionTime":"2025-09-30T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.009904 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.009935 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.009944 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.009958 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.009968 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:22Z","lastTransitionTime":"2025-09-30T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.113390 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.113497 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.113521 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.113552 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.113574 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:22Z","lastTransitionTime":"2025-09-30T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.215871 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.215935 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.215958 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.215985 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.216006 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:22Z","lastTransitionTime":"2025-09-30T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.319883 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.320294 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.320513 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.320724 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.321097 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:22Z","lastTransitionTime":"2025-09-30T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.424554 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.424612 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.424630 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.424655 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.424673 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:22Z","lastTransitionTime":"2025-09-30T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.527845 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.527908 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.527923 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.527946 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.527961 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:22Z","lastTransitionTime":"2025-09-30T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.630919 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.630961 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.630990 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.631004 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.631014 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:22Z","lastTransitionTime":"2025-09-30T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.734030 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.734336 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.734421 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.734498 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.734563 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:22Z","lastTransitionTime":"2025-09-30T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.837412 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.837450 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.837459 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.837472 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.837483 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:22Z","lastTransitionTime":"2025-09-30T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.939937 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.939977 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.939990 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.940006 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:22 crc kubenswrapper[4553]: I0930 19:33:22.940015 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:22Z","lastTransitionTime":"2025-09-30T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.043077 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.043289 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.043372 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.043453 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.043540 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:23Z","lastTransitionTime":"2025-09-30T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.146765 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.146817 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.146833 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.146856 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.146874 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:23Z","lastTransitionTime":"2025-09-30T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.249288 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.249364 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.249389 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.249419 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.249443 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:23Z","lastTransitionTime":"2025-09-30T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.351830 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.351879 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.351892 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.351911 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.351924 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:23Z","lastTransitionTime":"2025-09-30T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.454302 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.454540 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.454608 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.454670 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.454745 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:23Z","lastTransitionTime":"2025-09-30T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.502978 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:23 crc kubenswrapper[4553]: E0930 19:33:23.503152 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.503289 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.503332 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.503347 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:23 crc kubenswrapper[4553]: E0930 19:33:23.503479 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:23 crc kubenswrapper[4553]: E0930 19:33:23.503604 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:23 crc kubenswrapper[4553]: E0930 19:33:23.503727 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.556678 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.557687 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.557897 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.558166 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.558358 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:23Z","lastTransitionTime":"2025-09-30T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.661182 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.661397 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.661484 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.661549 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.661619 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:23Z","lastTransitionTime":"2025-09-30T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.764013 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.764376 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.764516 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.764676 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.764793 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:23Z","lastTransitionTime":"2025-09-30T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.867442 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.867510 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.867528 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.867554 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.867575 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:23Z","lastTransitionTime":"2025-09-30T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.971975 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.972376 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.972520 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.972677 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:23 crc kubenswrapper[4553]: I0930 19:33:23.972805 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:23Z","lastTransitionTime":"2025-09-30T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.076157 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.076207 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.076234 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.076253 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.076266 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:24Z","lastTransitionTime":"2025-09-30T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.178481 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.178526 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.178540 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.178559 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.178572 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:24Z","lastTransitionTime":"2025-09-30T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.281024 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.281155 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.281177 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.281207 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.281227 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:24Z","lastTransitionTime":"2025-09-30T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.383368 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.383414 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.383425 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.383441 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.383455 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:24Z","lastTransitionTime":"2025-09-30T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.486387 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.486453 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.486466 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.486481 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.486492 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:24Z","lastTransitionTime":"2025-09-30T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.588390 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.588424 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.588435 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.588456 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.588467 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:24Z","lastTransitionTime":"2025-09-30T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.690857 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.690899 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.690907 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.690925 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.690935 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:24Z","lastTransitionTime":"2025-09-30T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.792841 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.792896 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.792907 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.792927 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.792967 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:24Z","lastTransitionTime":"2025-09-30T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.895440 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.895512 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.895524 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.895570 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.895585 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:24Z","lastTransitionTime":"2025-09-30T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.997630 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.997668 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.997678 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.997693 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:24 crc kubenswrapper[4553]: I0930 19:33:24.997704 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:24Z","lastTransitionTime":"2025-09-30T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.100461 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.100542 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.100562 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.100594 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.100613 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:25Z","lastTransitionTime":"2025-09-30T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.203009 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.203087 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.203099 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.203120 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.203132 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:25Z","lastTransitionTime":"2025-09-30T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.305653 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.305734 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.305755 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.305778 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.305796 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:25Z","lastTransitionTime":"2025-09-30T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.408118 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.408191 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.408289 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.408316 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.408380 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:25Z","lastTransitionTime":"2025-09-30T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.503767 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.503767 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:25 crc kubenswrapper[4553]: E0930 19:33:25.503981 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.503796 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:25 crc kubenswrapper[4553]: E0930 19:33:25.504157 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.503792 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:25 crc kubenswrapper[4553]: E0930 19:33:25.504400 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:25 crc kubenswrapper[4553]: E0930 19:33:25.504256 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.511433 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.511454 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.511463 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.511478 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.511488 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:25Z","lastTransitionTime":"2025-09-30T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.614311 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.615241 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.615470 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.615775 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.615953 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:25Z","lastTransitionTime":"2025-09-30T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.718966 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.719073 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.719091 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.719125 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.719142 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:25Z","lastTransitionTime":"2025-09-30T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.822184 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.822249 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.822267 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.822289 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.822306 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:25Z","lastTransitionTime":"2025-09-30T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.925788 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.925837 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.925850 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.925871 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:25 crc kubenswrapper[4553]: I0930 19:33:25.925885 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:25Z","lastTransitionTime":"2025-09-30T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.029179 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.029250 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.029292 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.029322 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.029338 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:26Z","lastTransitionTime":"2025-09-30T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.132167 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.132235 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.132249 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.132274 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.132290 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:26Z","lastTransitionTime":"2025-09-30T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.234794 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.235006 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.235153 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.235224 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.235286 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:26Z","lastTransitionTime":"2025-09-30T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.338568 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.338796 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.338862 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.338928 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.339032 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:26Z","lastTransitionTime":"2025-09-30T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.441946 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.442012 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.442030 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.442092 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.442110 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:26Z","lastTransitionTime":"2025-09-30T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.544719 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.544781 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.544792 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.544805 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.544814 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:26Z","lastTransitionTime":"2025-09-30T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.648273 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.648348 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.648366 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.648395 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.648416 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:26Z","lastTransitionTime":"2025-09-30T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.751784 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.751853 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.751871 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.751896 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.751914 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:26Z","lastTransitionTime":"2025-09-30T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.855130 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.855207 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.855223 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.855248 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.855263 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:26Z","lastTransitionTime":"2025-09-30T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.958541 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.958637 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.958663 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.958699 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:26 crc kubenswrapper[4553]: I0930 19:33:26.958731 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:26Z","lastTransitionTime":"2025-09-30T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.061902 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.061962 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.061976 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.062000 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.062015 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:27Z","lastTransitionTime":"2025-09-30T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.165958 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.166494 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.166607 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.166761 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.166937 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:27Z","lastTransitionTime":"2025-09-30T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.270315 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.270389 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.270411 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.270446 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.270470 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:27Z","lastTransitionTime":"2025-09-30T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.374358 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.374419 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.374435 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.374459 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.374474 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:27Z","lastTransitionTime":"2025-09-30T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.483273 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.483337 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.483356 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.483388 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.483410 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:27Z","lastTransitionTime":"2025-09-30T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.503159 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:27 crc kubenswrapper[4553]: E0930 19:33:27.503879 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.503237 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:27 crc kubenswrapper[4553]: E0930 19:33:27.505170 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.503235 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:27 crc kubenswrapper[4553]: E0930 19:33:27.505400 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.503379 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:27 crc kubenswrapper[4553]: E0930 19:33:27.506654 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.519378 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fde
e88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resource
s\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.538714 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.556210 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.577398 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.585698 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.585740 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.585752 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.585772 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.585790 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:27Z","lastTransitionTime":"2025-09-30T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.597622 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:
32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.617794 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-
proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.655980 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-d
ir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa37232
69019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a
5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitC
ode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.677309 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56d
f84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.689135 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.689190 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.689205 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.689228 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.689242 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:27Z","lastTransitionTime":"2025-09-30T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.695546 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.717665 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:13Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 
2025-08-24T17:21:41Z]\\\\nI0930 19:33:13.358489 6123 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.732211 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf65d04-1873-4650-868c-076118fd4dd5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae915f74f875c4b4aae052da19bcce9693322ed42573211a3fd458761891b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fb42a36284069544112bc5523c6c89b0d0cae4b3cfd7bb292a05691e1de01cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://196c8c016a4602b9d6bda11b4c30276c3536485dc8c75529c0c7059816768d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.745491 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.755803 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.769744 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.781243 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.792365 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.792468 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.792491 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.792699 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.792714 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:27Z","lastTransitionTime":"2025-09-30T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.794784 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.806142 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc 
kubenswrapper[4553]: I0930 19:33:27.830777 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:27Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.896110 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.896162 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.896174 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.896195 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:27 crc kubenswrapper[4553]: I0930 19:33:27.896211 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:27Z","lastTransitionTime":"2025-09-30T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.000871 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.000928 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.000942 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.000972 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.000988 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.103548 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.103595 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.103606 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.103623 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.103635 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.206188 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.206217 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.206227 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.206239 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.206248 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.235809 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.235828 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.235836 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.235847 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.235854 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: E0930 19:33:28.247723 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:28Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.251767 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.251858 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.251881 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.251913 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.251934 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: E0930 19:33:28.272080 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:28Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.277733 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.277791 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.277808 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.277827 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.278219 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.300739 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.300763 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.300774 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.300787 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.300797 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.326231 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.326264 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.326273 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.326315 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.326325 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: E0930 19:33:28.343014 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:28Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:28 crc kubenswrapper[4553]: E0930 19:33:28.343131 4553 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.344314 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.344332 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.344343 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.344358 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.344367 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.446275 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.446303 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.446311 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.446323 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.446331 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.549850 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.549914 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.549939 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.549967 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.549988 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.654100 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.654150 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.654167 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.654188 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.654204 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.760351 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.760451 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.760470 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.760503 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.760529 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.864228 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.864304 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.864396 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.864438 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.864466 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.968024 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.968142 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.968163 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.968568 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:28 crc kubenswrapper[4553]: I0930 19:33:28.968636 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:28Z","lastTransitionTime":"2025-09-30T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.073084 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.073183 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.073210 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.073245 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.073271 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:29Z","lastTransitionTime":"2025-09-30T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.176268 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.176320 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.176331 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.176352 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.176366 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:29Z","lastTransitionTime":"2025-09-30T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.279913 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.280177 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.280257 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.280333 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.280395 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:29Z","lastTransitionTime":"2025-09-30T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.384742 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.384781 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.384789 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.384803 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.384815 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:29Z","lastTransitionTime":"2025-09-30T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.486713 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.486995 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.487140 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.487247 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.487339 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:29Z","lastTransitionTime":"2025-09-30T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.503844 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.503882 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.503860 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:29 crc kubenswrapper[4553]: E0930 19:33:29.503975 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.504009 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:29 crc kubenswrapper[4553]: E0930 19:33:29.504133 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:29 crc kubenswrapper[4553]: E0930 19:33:29.504235 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:29 crc kubenswrapper[4553]: E0930 19:33:29.504302 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.589789 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.589831 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.589839 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.589854 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.589867 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:29Z","lastTransitionTime":"2025-09-30T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.692405 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.692440 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.692448 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.692461 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.692470 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:29Z","lastTransitionTime":"2025-09-30T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.795358 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.795398 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.795409 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.795425 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.795437 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:29Z","lastTransitionTime":"2025-09-30T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.899054 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.899099 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.899108 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.899124 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:29 crc kubenswrapper[4553]: I0930 19:33:29.899137 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:29Z","lastTransitionTime":"2025-09-30T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.001793 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.001866 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.001885 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.001913 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.001934 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:30Z","lastTransitionTime":"2025-09-30T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.112356 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.112484 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.112540 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.112581 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.112635 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:30Z","lastTransitionTime":"2025-09-30T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.226264 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.226368 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.226390 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.226418 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.226441 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:30Z","lastTransitionTime":"2025-09-30T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.329828 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.329873 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.329886 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.329907 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.329920 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:30Z","lastTransitionTime":"2025-09-30T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.433828 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.433893 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.433918 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.433951 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.433973 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:30Z","lastTransitionTime":"2025-09-30T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.505338 4553 scope.go:117] "RemoveContainer" containerID="e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60" Sep 30 19:33:30 crc kubenswrapper[4553]: E0930 19:33:30.505758 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.537155 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.537191 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.537203 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.537220 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.537235 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:30Z","lastTransitionTime":"2025-09-30T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.640678 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.640727 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.640737 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.640758 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.640774 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:30Z","lastTransitionTime":"2025-09-30T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.744090 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.744122 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.744136 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.744156 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.744173 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:30Z","lastTransitionTime":"2025-09-30T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.846696 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.846793 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.846805 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.846822 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.846833 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:30Z","lastTransitionTime":"2025-09-30T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.948985 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.949052 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.949063 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.949083 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:30 crc kubenswrapper[4553]: I0930 19:33:30.949095 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:30Z","lastTransitionTime":"2025-09-30T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.052347 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.052386 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.052397 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.052414 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.052426 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:31Z","lastTransitionTime":"2025-09-30T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.154610 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.155235 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.155280 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.155304 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.155319 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:31Z","lastTransitionTime":"2025-09-30T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.258233 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.258274 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.258283 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.258298 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.258309 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:31Z","lastTransitionTime":"2025-09-30T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.361530 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.361587 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.361602 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.361622 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.361635 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:31Z","lastTransitionTime":"2025-09-30T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.464065 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.464105 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.464114 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.464130 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.464140 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:31Z","lastTransitionTime":"2025-09-30T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.503928 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.503968 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.504098 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.503949 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:31 crc kubenswrapper[4553]: E0930 19:33:31.504182 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:31 crc kubenswrapper[4553]: E0930 19:33:31.504342 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:31 crc kubenswrapper[4553]: E0930 19:33:31.504441 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:31 crc kubenswrapper[4553]: E0930 19:33:31.504659 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.566885 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.566936 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.566945 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.566966 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.566977 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:31Z","lastTransitionTime":"2025-09-30T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.669785 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.669829 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.669838 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.669857 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.669874 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:31Z","lastTransitionTime":"2025-09-30T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.773097 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.773160 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.773171 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.773188 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.773201 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:31Z","lastTransitionTime":"2025-09-30T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.876473 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.876536 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.876552 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.876575 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.876597 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:31Z","lastTransitionTime":"2025-09-30T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.979244 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.979301 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.979316 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.979337 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:31 crc kubenswrapper[4553]: I0930 19:33:31.979379 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:31Z","lastTransitionTime":"2025-09-30T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.081832 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.081863 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.081871 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.081885 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.081895 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:32Z","lastTransitionTime":"2025-09-30T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.184436 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.184498 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.184509 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.184526 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.184540 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:32Z","lastTransitionTime":"2025-09-30T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.287375 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.287419 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.287430 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.287445 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.287455 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:32Z","lastTransitionTime":"2025-09-30T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.390243 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.390304 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.390313 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.390326 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.390335 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:32Z","lastTransitionTime":"2025-09-30T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.493011 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.493083 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.493097 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.493115 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.493126 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:32Z","lastTransitionTime":"2025-09-30T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.595026 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.595077 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.595086 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.595099 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.595108 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:32Z","lastTransitionTime":"2025-09-30T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.697397 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.697443 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.697452 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.697469 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.697479 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:32Z","lastTransitionTime":"2025-09-30T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.799717 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.799787 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.799808 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.799841 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.799868 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:32Z","lastTransitionTime":"2025-09-30T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.902938 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.902992 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.903002 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.903019 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:32 crc kubenswrapper[4553]: I0930 19:33:32.903030 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:32Z","lastTransitionTime":"2025-09-30T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.005887 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.005955 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.005996 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.006019 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.006031 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:33Z","lastTransitionTime":"2025-09-30T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.110085 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.110569 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.110776 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.110997 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.111245 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:33Z","lastTransitionTime":"2025-09-30T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.214319 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.214408 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.214429 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.214458 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.214476 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:33Z","lastTransitionTime":"2025-09-30T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.317323 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.317394 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.317421 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.317458 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.317488 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:33Z","lastTransitionTime":"2025-09-30T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.420955 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.421018 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.421073 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.421097 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.421112 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:33Z","lastTransitionTime":"2025-09-30T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.503651 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.503689 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.503743 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.503689 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:33 crc kubenswrapper[4553]: E0930 19:33:33.503844 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:33 crc kubenswrapper[4553]: E0930 19:33:33.504056 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:33 crc kubenswrapper[4553]: E0930 19:33:33.504178 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:33 crc kubenswrapper[4553]: E0930 19:33:33.504283 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.523142 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.523192 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.523204 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.523223 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.523238 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:33Z","lastTransitionTime":"2025-09-30T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.626152 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.626235 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.626262 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.626298 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.626326 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:33Z","lastTransitionTime":"2025-09-30T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.729287 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.729340 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.729355 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.729378 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.729395 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:33Z","lastTransitionTime":"2025-09-30T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.831592 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.831626 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.831638 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.831655 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.831667 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:33Z","lastTransitionTime":"2025-09-30T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.933994 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.934053 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.934063 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.934081 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:33 crc kubenswrapper[4553]: I0930 19:33:33.934090 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:33Z","lastTransitionTime":"2025-09-30T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.036796 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.036868 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.036892 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.036920 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.036940 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:34Z","lastTransitionTime":"2025-09-30T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.139408 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.139471 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.139482 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.139508 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.139520 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:34Z","lastTransitionTime":"2025-09-30T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.241905 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.241948 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.241957 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.241974 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.241983 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:34Z","lastTransitionTime":"2025-09-30T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.344369 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.344406 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.344416 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.344433 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.344444 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:34Z","lastTransitionTime":"2025-09-30T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.446671 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.446708 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.446719 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.446735 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.446747 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:34Z","lastTransitionTime":"2025-09-30T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.548736 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.548793 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.548808 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.548829 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.548849 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:34Z","lastTransitionTime":"2025-09-30T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.651450 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.651487 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.651496 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.651514 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.651524 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:34Z","lastTransitionTime":"2025-09-30T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.753563 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.753632 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.753645 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.753663 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.753678 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:34Z","lastTransitionTime":"2025-09-30T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.856597 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.856663 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.856675 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.856698 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.856712 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:34Z","lastTransitionTime":"2025-09-30T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.960468 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.960520 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.960530 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.960549 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:34 crc kubenswrapper[4553]: I0930 19:33:34.960563 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:34Z","lastTransitionTime":"2025-09-30T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.062635 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.062705 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.062723 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.062750 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.062768 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:35Z","lastTransitionTime":"2025-09-30T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.165250 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.165306 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.165321 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.165343 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.165373 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:35Z","lastTransitionTime":"2025-09-30T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.268144 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.268186 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.268195 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.268213 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.268223 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:35Z","lastTransitionTime":"2025-09-30T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.371366 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.371460 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.371477 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.371500 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.371517 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:35Z","lastTransitionTime":"2025-09-30T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.482808 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.482848 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.482859 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.482877 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.482888 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:35Z","lastTransitionTime":"2025-09-30T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.503127 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.503186 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.503188 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:35 crc kubenswrapper[4553]: E0930 19:33:35.503271 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.503375 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:35 crc kubenswrapper[4553]: E0930 19:33:35.503375 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:35 crc kubenswrapper[4553]: E0930 19:33:35.503475 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:35 crc kubenswrapper[4553]: E0930 19:33:35.503558 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.585843 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.585893 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.585903 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.585923 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.585935 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:35Z","lastTransitionTime":"2025-09-30T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.689071 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.689125 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.689139 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.689161 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.689173 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:35Z","lastTransitionTime":"2025-09-30T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.791909 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.791956 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.791969 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.791988 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.791999 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:35Z","lastTransitionTime":"2025-09-30T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.894599 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.894656 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.894670 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.894691 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.894705 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:35Z","lastTransitionTime":"2025-09-30T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.937665 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vzlwd_0d6b9396-3666-49a3-9d06-f764a3b39edf/kube-multus/0.log" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.937725 4553 generic.go:334] "Generic (PLEG): container finished" podID="0d6b9396-3666-49a3-9d06-f764a3b39edf" containerID="f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00" exitCode=1 Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.937763 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vzlwd" event={"ID":"0d6b9396-3666-49a3-9d06-f764a3b39edf","Type":"ContainerDied","Data":"f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00"} Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.938318 4553 scope.go:117] "RemoveContainer" containerID="f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.963932 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:35Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.975660 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:35Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.989686 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf65d04-1873-4650-868c-076118fd4dd5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae915f74f875c4b4aae052da19bcce9693322ed42573211a3fd458761891b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fb42a36284069544112bc5523c6c89b0d0cae4b3cfd7bb292a05691e1de01cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://196c8c016a4602b9d6bda11b4c30276c3536485dc8c75529c0c7059816768d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:35Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.998393 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.998452 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.998466 4553 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.998491 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:35 crc kubenswrapper[4553]: I0930 19:33:35.998502 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:35Z","lastTransitionTime":"2025-09-30T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.002771 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.013591 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.030962 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b550
8b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.045111 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.057109 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc 
kubenswrapper[4553]: I0930 19:33:36.072125 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.086503 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:35Z\\\",\\\"message\\\":\\\"2025-09-30T19:32:50+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b\\\\n2025-09-30T19:32:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b to /host/opt/cni/bin/\\\\n2025-09-30T19:32:50Z [verbose] multus-daemon started\\\\n2025-09-30T19:32:50Z [verbose] Readiness Indicator file check\\\\n2025-09-30T19:33:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.099202 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd
6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.100802 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.100835 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.100847 4553 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.100867 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.100879 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:36Z","lastTransitionTime":"2025-09-30T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.113469 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.131916 4553 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.145992 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.165687 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:13Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 
2025-08-24T17:21:41Z]\\\\nI0930 19:33:13.358489 6123 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.186074 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.201107 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.203110 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.203158 4553 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.203169 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.203191 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.203203 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:36Z","lastTransitionTime":"2025-09-30T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.214263 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.307152 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.307211 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.307231 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.307256 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.307275 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:36Z","lastTransitionTime":"2025-09-30T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.410458 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.410514 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.410532 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.410557 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.410577 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:36Z","lastTransitionTime":"2025-09-30T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.514771 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.514820 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.514835 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.514860 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.514879 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:36Z","lastTransitionTime":"2025-09-30T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.619399 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.619458 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.619469 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.619489 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.619500 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:36Z","lastTransitionTime":"2025-09-30T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.722377 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.722431 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.722449 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.722472 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.722490 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:36Z","lastTransitionTime":"2025-09-30T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.826399 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.826464 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.826478 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.826500 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.826513 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:36Z","lastTransitionTime":"2025-09-30T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.930158 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.930223 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.930239 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.930267 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.930282 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:36Z","lastTransitionTime":"2025-09-30T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.952661 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vzlwd_0d6b9396-3666-49a3-9d06-f764a3b39edf/kube-multus/0.log" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.952718 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vzlwd" event={"ID":"0d6b9396-3666-49a3-9d06-f764a3b39edf","Type":"ContainerStarted","Data":"b6dc41b9c827c96a6cf2567e40b6c09a48358331418ee9f753187a7381186e93"} Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.967709 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc kubenswrapper[4553]: I0930 19:33:36.981468 4553 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:36 crc 
kubenswrapper[4553]: I0930 19:33:36.998391 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:36Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.014078 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.027391 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.033010 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.033094 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.033107 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.033126 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.033160 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:37Z","lastTransitionTime":"2025-09-30T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.047053 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.062521 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.080740 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6dc41b9c827c96a6cf2567e40b6c09a48358331418ee9f753187a7381186e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:35Z\\\",\\\"message\\\":\\\"2025-09-30T19:32:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b\\\\n2025-09-30T19:32:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b to /host/opt/cni/bin/\\\\n2025-09-30T19:32:50Z [verbose] multus-daemon started\\\\n2025-09-30T19:32:50Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T19:33:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.100252 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd
6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.134313 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.135779 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.135803 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.135811 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.135826 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.135836 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:37Z","lastTransitionTime":"2025-09-30T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.152550 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.165506 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.191901 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:13Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 
2025-08-24T17:21:41Z]\\\\nI0930 19:33:13.358489 6123 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.210234 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf65d04-1873-4650-868c-076118fd4dd5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae915f74f875c4b4aae052da19bcce9693322ed42573211a3fd458761891b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fb42a36284069544112bc5523c6c89b0d0cae4b3cfd7bb292a05691e1de01cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://196c8c016a4602b9d6bda11b4c30276c3536485dc8c75529c0c7059816768d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.229130 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.238088 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.238142 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.238160 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 
19:33:37.238186 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.238207 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:37Z","lastTransitionTime":"2025-09-30T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.246391 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.261970 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.278929 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.342170 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.342234 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.342250 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.342274 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.342289 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:37Z","lastTransitionTime":"2025-09-30T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.445105 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.445161 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.445171 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.445193 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.445204 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:37Z","lastTransitionTime":"2025-09-30T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.503787 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.503902 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.503941 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.503805 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:37 crc kubenswrapper[4553]: E0930 19:33:37.504186 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:37 crc kubenswrapper[4553]: E0930 19:33:37.504371 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:37 crc kubenswrapper[4553]: E0930 19:33:37.504565 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:37 crc kubenswrapper[4553]: E0930 19:33:37.504698 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.523062 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776
590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.524323 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.541651 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc 
kubenswrapper[4553]: I0930 19:33:37.547708 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.547904 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.548002 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.548092 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.548183 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:37Z","lastTransitionTime":"2025-09-30T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.558030 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.570535 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.583354 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.608019 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.621302 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.636583 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6dc41b9c827c96a6cf2567e40b6c09a48358331418ee9f753187a7381186e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:35Z\\\",\\\"message\\\":\\\"2025-09-30T19:32:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b\\\\n2025-09-30T19:32:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b to /host/opt/cni/bin/\\\\n2025-09-30T19:32:50Z [verbose] multus-daemon started\\\\n2025-09-30T19:32:50Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T19:33:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.650987 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd
6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.651162 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.651214 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.651228 4553 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.651252 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.651270 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:37Z","lastTransitionTime":"2025-09-30T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.673028 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d
83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.688931 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56d
f84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.701266 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.723635 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:13Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 
2025-08-24T17:21:41Z]\\\\nI0930 19:33:13.358489 6123 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.742129 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf65d04-1873-4650-868c-076118fd4dd5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae915f74f875c4b4aae052da19bcce9693322ed42573211a3fd458761891b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fb42a36284069544112bc5523c6c89b0d0cae4b3cfd7bb292a05691e1de01cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://196c8c016a4602b9d6bda11b4c30276c3536485dc8c75529c0c7059816768d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.754759 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.754808 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.754819 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.754837 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.754853 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:37Z","lastTransitionTime":"2025-09-30T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.755510 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.768841 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.785539 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.799294 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:37Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.857167 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.857189 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.857198 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.857214 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.857224 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:37Z","lastTransitionTime":"2025-09-30T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.959551 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.959594 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.959603 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.959639 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:37 crc kubenswrapper[4553]: I0930 19:33:37.959650 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:37Z","lastTransitionTime":"2025-09-30T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.063018 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.063071 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.063081 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.063096 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.063108 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:38Z","lastTransitionTime":"2025-09-30T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.165795 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.165826 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.165836 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.165850 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.165860 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:38Z","lastTransitionTime":"2025-09-30T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.268269 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.268304 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.268313 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.268327 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.268338 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:38Z","lastTransitionTime":"2025-09-30T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.372192 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.372248 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.372258 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.372278 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.372294 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:38Z","lastTransitionTime":"2025-09-30T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.476335 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.476400 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.476415 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.476440 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.476456 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:38Z","lastTransitionTime":"2025-09-30T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.564308 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.564402 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.564430 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.564468 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.564489 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:38Z","lastTransitionTime":"2025-09-30T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:38 crc kubenswrapper[4553]: E0930 19:33:38.578518 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:38Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.583510 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.583560 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.583583 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.583611 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.583630 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:38Z","lastTransitionTime":"2025-09-30T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:38 crc kubenswrapper[4553]: E0930 19:33:38.596165 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:38Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.600868 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.601010 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.601098 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.601138 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.601231 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:38Z","lastTransitionTime":"2025-09-30T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:38Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.646491 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.646555 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.646579 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.646611 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.646636 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:38Z","lastTransitionTime":"2025-09-30T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:38 crc kubenswrapper[4553]: E0930 19:33:38.666369 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:38Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:38 crc kubenswrapper[4553]: E0930 19:33:38.666612 4553 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.668870 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.668915 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.668933 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.668961 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.668985 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:38Z","lastTransitionTime":"2025-09-30T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.772147 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.772206 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.772226 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.772254 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.772275 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:38Z","lastTransitionTime":"2025-09-30T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.874690 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.874715 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.874726 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.874742 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.874754 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:38Z","lastTransitionTime":"2025-09-30T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.977441 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.977519 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.977540 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.977570 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:38 crc kubenswrapper[4553]: I0930 19:33:38.977591 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:38Z","lastTransitionTime":"2025-09-30T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.079905 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.079950 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.079967 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.079988 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.080003 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:39Z","lastTransitionTime":"2025-09-30T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.183195 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.183256 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.183276 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.183301 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.183319 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:39Z","lastTransitionTime":"2025-09-30T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.285695 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.285756 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.285772 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.285797 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.285815 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:39Z","lastTransitionTime":"2025-09-30T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.388282 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.388330 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.388346 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.388364 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.388377 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:39Z","lastTransitionTime":"2025-09-30T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.491868 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.491945 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.491965 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.491994 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.492012 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:39Z","lastTransitionTime":"2025-09-30T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.503686 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.503687 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.504170 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:39 crc kubenswrapper[4553]: E0930 19:33:39.504396 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.504772 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:39 crc kubenswrapper[4553]: E0930 19:33:39.504970 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:39 crc kubenswrapper[4553]: E0930 19:33:39.505381 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:39 crc kubenswrapper[4553]: E0930 19:33:39.505525 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.596144 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.596232 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.596256 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.596289 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.596307 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:39Z","lastTransitionTime":"2025-09-30T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.698756 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.698873 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.698892 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.698923 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.698942 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:39Z","lastTransitionTime":"2025-09-30T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.802410 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.802483 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.802503 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.802530 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.802549 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:39Z","lastTransitionTime":"2025-09-30T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.905403 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.905467 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.905487 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.905513 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:39 crc kubenswrapper[4553]: I0930 19:33:39.905534 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:39Z","lastTransitionTime":"2025-09-30T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.007877 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.007908 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.007919 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.007940 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.007953 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:40Z","lastTransitionTime":"2025-09-30T19:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.110734 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.110762 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.110769 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.110784 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.110794 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:40Z","lastTransitionTime":"2025-09-30T19:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.212831 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.212873 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.212884 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.212902 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.212913 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:40Z","lastTransitionTime":"2025-09-30T19:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.315628 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.315713 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.315733 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.316138 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.316401 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:40Z","lastTransitionTime":"2025-09-30T19:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.419444 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.419481 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.419492 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.419509 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.419520 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:40Z","lastTransitionTime":"2025-09-30T19:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.522501 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.522662 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.522689 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.522719 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.522739 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:40Z","lastTransitionTime":"2025-09-30T19:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.625890 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.625962 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.625981 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.626012 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.626035 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:40Z","lastTransitionTime":"2025-09-30T19:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.729185 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.729234 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.729246 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.729264 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.729275 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:40Z","lastTransitionTime":"2025-09-30T19:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.831757 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.831806 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.831820 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.831842 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.831858 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:40Z","lastTransitionTime":"2025-09-30T19:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.934696 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.934734 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.934745 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.934760 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:40 crc kubenswrapper[4553]: I0930 19:33:40.934770 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:40Z","lastTransitionTime":"2025-09-30T19:33:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.037265 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.037337 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.037350 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.037368 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.037381 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:41Z","lastTransitionTime":"2025-09-30T19:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.139568 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.140271 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.140353 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.140435 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.140513 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:41Z","lastTransitionTime":"2025-09-30T19:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.243336 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.243453 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.243481 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.243518 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.243546 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:41Z","lastTransitionTime":"2025-09-30T19:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.346352 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.346386 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.346394 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.346408 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.346418 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:41Z","lastTransitionTime":"2025-09-30T19:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.449844 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.450222 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.450324 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.450429 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.450510 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:41Z","lastTransitionTime":"2025-09-30T19:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.504765 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:41 crc kubenswrapper[4553]: E0930 19:33:41.504952 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.505032 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:41 crc kubenswrapper[4553]: E0930 19:33:41.505140 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.505562 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:41 crc kubenswrapper[4553]: E0930 19:33:41.505682 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.505715 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:41 crc kubenswrapper[4553]: E0930 19:33:41.506666 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.507064 4553 scope.go:117] "RemoveContainer" containerID="e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.555634 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.556346 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.556362 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.556379 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.556395 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:41Z","lastTransitionTime":"2025-09-30T19:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.659397 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.659444 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.659456 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.659477 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.659494 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:41Z","lastTransitionTime":"2025-09-30T19:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.761362 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.761675 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.761774 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.761928 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.762078 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:41Z","lastTransitionTime":"2025-09-30T19:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.863923 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.863986 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.864002 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.864021 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.864104 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:41Z","lastTransitionTime":"2025-09-30T19:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.965786 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.965828 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.965840 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.965858 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.965870 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:41Z","lastTransitionTime":"2025-09-30T19:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.970816 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/2.log" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.973232 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerStarted","Data":"58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c"} Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.974401 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:33:41 crc kubenswrapper[4553]: I0930 19:33:41.988259 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791
878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\
\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:41Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.009436 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd
10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.024079 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.038722 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.051851 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.066472 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6dc41b9c827c96a6cf2567e40b6c09a48358331418ee9f753187a7381186e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:35Z\\\",\\\"message\\\":\\\"2025-09-30T19:32:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b\\\\n2025-09-30T19:32:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b to /host/opt/cni/bin/\\\\n2025-09-30T19:32:50Z [verbose] multus-daemon started\\\\n2025-09-30T19:32:50Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T19:33:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.068338 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.068413 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.068431 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.068486 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.068506 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:42Z","lastTransitionTime":"2025-09-30T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.097641 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.112460 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56d
f84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.123760 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.141961 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:13Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 
2025-08-24T17:21:41Z]\\\\nI0930 19:33:13.358489 6123 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run
/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.152690 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf65d04-1873-4650-868c-076118fd4dd5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae915f74f875c4b4aae052da19bcce9693322ed42573211a3fd458761891b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fb42a36284069544112bc5523c6c89b0d0cae4b3cfd7bb292a05691e1de01cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://196c8c016a4602b9d6bda11b4c30276c3536485dc8c75529c0c7059816768d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.165863 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.171752 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.171786 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.171797 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 
19:33:42.171816 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.171828 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:42Z","lastTransitionTime":"2025-09-30T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.181011 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.201830 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.220063 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.235769 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff8d817f-2918-40e6-92da-79585629fb4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ae6eef7edd538ebef5374a4feae9a8889fb7d671191b3cc3d86f083fc60b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fd673fbde795ab7edca5fcb2a0cbf9115639b912f914075014349f1649132d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fd673fbde795ab7edca5fcb2a0cbf9115639b912f914075014349f1649132d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc 
kubenswrapper[4553]: I0930 19:33:42.295680 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.295723 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.295734 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.295752 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.295761 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:42Z","lastTransitionTime":"2025-09-30T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.309052 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.325602 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc 
kubenswrapper[4553]: I0930 19:33:42.341445 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.397696 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.397768 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.397780 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.397798 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.397811 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:42Z","lastTransitionTime":"2025-09-30T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.500283 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.500323 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.500337 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.500354 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.500363 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:42Z","lastTransitionTime":"2025-09-30T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.602602 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.602650 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.602662 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.602682 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.602694 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:42Z","lastTransitionTime":"2025-09-30T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.704978 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.705016 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.705028 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.705220 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.705232 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:42Z","lastTransitionTime":"2025-09-30T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.807226 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.807265 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.807275 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.807290 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.807300 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:42Z","lastTransitionTime":"2025-09-30T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.910448 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.910500 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.910516 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.910534 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.910547 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:42Z","lastTransitionTime":"2025-09-30T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.979637 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/3.log" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.980460 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/2.log" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.983695 4553 generic.go:334] "Generic (PLEG): container finished" podID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerID="58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c" exitCode=1 Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.983733 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c"} Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.983771 4553 scope.go:117] "RemoveContainer" containerID="e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.985301 4553 scope.go:117] "RemoveContainer" containerID="58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c" Sep 30 19:33:42 crc kubenswrapper[4553]: E0930 19:33:42.985587 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" Sep 30 19:33:42 crc kubenswrapper[4553]: I0930 19:33:42.999663 4553 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:42Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.014726 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.014790 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.014804 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.014822 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.014834 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:43Z","lastTransitionTime":"2025-09-30T19:33:43Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.020296 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.034686 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-3
0T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.050623 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.066306 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6dc41b9c827c96a6cf2567e40b6c09a48358331418ee9f753187a7381186e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:35Z\\\",\\\"message\\\":\\\"2025-09-30T19:32:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b\\\\n2025-09-30T19:32:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b to /host/opt/cni/bin/\\\\n2025-09-30T19:32:50Z [verbose] multus-daemon started\\\\n2025-09-30T19:32:50Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T19:33:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.078442 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd
6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.106133 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.117504 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.117577 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.117593 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.117617 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.117633 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:43Z","lastTransitionTime":"2025-09-30T19:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.119336 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.130567 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.150830 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0a451cbca007e1d7efe7b7413968fcf4340d624c5ad9b01a9a5cecc78a44e60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:13Z\\\",\\\"message\\\":\\\"e crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:13Z is after 
2025-08-24T17:21:41Z]\\\\nI0930 19:33:13.358489 6123 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:42Z\\\",\\\"message\\\":\\\"l\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 19:33:42.671228 6466 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}\\\\nI0930 19:33:42.671933 6466 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-controller-manager-operator for network=default : 2.21187ms\\\\nI0930 19:33:42.671968 6466 services_controller.go:356] Processing sync for service openshift-kube-storage-version-migrator-operator/metrics for network=default\\\\nF0930 19:33:42.672345 6466 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2b
c54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.163909 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf65d04-1873-4650-868c-076118fd4dd5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae915f74f875c4b4aae052da19bcce9693322ed42573211a3fd458761891b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fb42a36284069544112bc5523c6c89b0d0cae4b3cfd7bb292a05691e1de01cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://196c8c016a4602b9d6bda11b4c30276c3536485dc8c75529c0c7059816768d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.178380 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.190438 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.204932 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.217204 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.220648 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.220697 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.220714 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.220738 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.220755 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:43Z","lastTransitionTime":"2025-09-30T19:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.231482 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff8d817f-2918-40e6-92da-79585629fb4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ae6eef7edd538ebef5374a4feae9a8889fb7d671191b3cc3d86f083fc60b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fd673fbde795ab7edca5fcb2a0cbf9115639b912f914075014349f1649132d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fd673fbde795ab7edca5fcb2a0cbf9115639b912f914075014349f1649132d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.247572 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.257546 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc 
kubenswrapper[4553]: I0930 19:33:43.271298 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:43Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.323479 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.323529 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.323543 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.323559 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.323943 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:43Z","lastTransitionTime":"2025-09-30T19:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.426187 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.426226 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.426240 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.426255 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.426267 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:43Z","lastTransitionTime":"2025-09-30T19:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.504019 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.504062 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.504115 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.504131 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:43 crc kubenswrapper[4553]: E0930 19:33:43.504197 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:43 crc kubenswrapper[4553]: E0930 19:33:43.504333 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:43 crc kubenswrapper[4553]: E0930 19:33:43.504463 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:43 crc kubenswrapper[4553]: E0930 19:33:43.504551 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.527963 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.528003 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.528014 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.528029 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.528055 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:43Z","lastTransitionTime":"2025-09-30T19:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.630816 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.630868 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.630888 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.630913 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.630930 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:43Z","lastTransitionTime":"2025-09-30T19:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.734132 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.734195 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.734209 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.734226 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.734238 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:43Z","lastTransitionTime":"2025-09-30T19:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.837115 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.837180 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.837204 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.837233 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.837258 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:43Z","lastTransitionTime":"2025-09-30T19:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.939955 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.940000 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.940011 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.940025 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.940060 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:43Z","lastTransitionTime":"2025-09-30T19:33:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.989589 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/3.log" Sep 30 19:33:43 crc kubenswrapper[4553]: I0930 19:33:43.995883 4553 scope.go:117] "RemoveContainer" containerID="58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c" Sep 30 19:33:43 crc kubenswrapper[4553]: E0930 19:33:43.996241 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.009586 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.027825 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.042558 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.042593 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.042605 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.042620 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.042630 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:44Z","lastTransitionTime":"2025-09-30T19:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.051698 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6dc41b9c827c96a6cf2567e40b6c09a48358331418ee9f753187a7381186e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f300
1c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:35Z\\\",\\\"message\\\":\\\"2025-09-30T19:32:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b\\\\n2025-09-30T19:32:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b to /host/opt/cni/bin/\\\\n2025-09-30T19:32:50Z [verbose] multus-daemon started\\\\n2025-09-30T19:32:50Z [verbose] Readiness Indicator file check\\\\n2025-09-30T19:33:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.068564 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd
6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.087533 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.100862 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.115354 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.147284 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:42Z\\\",\\\"message\\\":\\\"l\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 19:33:42.671228 6466 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}\\\\nI0930 19:33:42.671933 6466 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-controller-manager-operator for network=default : 2.21187ms\\\\nI0930 19:33:42.671968 6466 services_controller.go:356] Processing sync for service openshift-kube-storage-version-migrator-operator/metrics for network=default\\\\nF0930 19:33:42.672345 6466 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.149439 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.149487 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.149498 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.149516 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.149527 4553 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:44Z","lastTransitionTime":"2025-09-30T19:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.177463 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.198332 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56d
f84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.211161 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255a
b5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.224829 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.237972 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.251298 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf65d04-1873-4650-868c-076118fd4dd5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae915f74f875c4b4aae052da19bcce9693322ed42573211a3fd458761891b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fb42a36284069544112bc5523c6c89b0d0cae4b3cfd7bb292a05691e1de01cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://196c8c016a4602b9d6bda11b4c30276c3536485dc8c75529c0c7059816768d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.252581 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.252671 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.252688 4553 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.252709 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.252722 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:44Z","lastTransitionTime":"2025-09-30T19:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.265999 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.278359 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc 
kubenswrapper[4553]: I0930 19:33:44.294416 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.308440 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff8d817f-2918-40e6-92da-79585629fb4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ae6eef7edd538ebef5374a4feae9a8889fb7d671191b3cc3d86f083fc60b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fd673fbde795ab7edca5fcb2a0cbf9115639b912f914075014349f1649132d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fd673fbde795ab7edca5fcb2a0cbf9115639b912f914075014349f1649132d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.323571 4553 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325745
3265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:44Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.355745 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.355794 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.355806 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.355826 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.355841 4553 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:44Z","lastTransitionTime":"2025-09-30T19:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.458709 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.458758 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.458772 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.458792 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.458806 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:44Z","lastTransitionTime":"2025-09-30T19:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.561783 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.561824 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.561833 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.561853 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.561865 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:44Z","lastTransitionTime":"2025-09-30T19:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.663648 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.663685 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.663693 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.663707 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.663717 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:44Z","lastTransitionTime":"2025-09-30T19:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.766437 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.766519 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.766538 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.766565 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.766586 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:44Z","lastTransitionTime":"2025-09-30T19:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.869112 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.869140 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.869149 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.869163 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.869173 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:44Z","lastTransitionTime":"2025-09-30T19:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.971795 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.971829 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.971839 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.971855 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:44 crc kubenswrapper[4553]: I0930 19:33:44.971866 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:44Z","lastTransitionTime":"2025-09-30T19:33:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.073934 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.073970 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.073979 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.073993 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.074002 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:45Z","lastTransitionTime":"2025-09-30T19:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.176877 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.176908 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.176918 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.176933 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.176945 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:45Z","lastTransitionTime":"2025-09-30T19:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.280131 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.280285 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.280308 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.280339 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.280361 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:45Z","lastTransitionTime":"2025-09-30T19:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.383342 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.383426 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.383442 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.383459 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.383472 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:45Z","lastTransitionTime":"2025-09-30T19:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.486015 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.486084 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.486097 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.486116 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.486131 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:45Z","lastTransitionTime":"2025-09-30T19:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.503425 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.503534 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:45 crc kubenswrapper[4553]: E0930 19:33:45.503588 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.503425 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:45 crc kubenswrapper[4553]: E0930 19:33:45.503732 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.503433 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:45 crc kubenswrapper[4553]: E0930 19:33:45.503853 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:45 crc kubenswrapper[4553]: E0930 19:33:45.503910 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.589839 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.589905 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.589929 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.589962 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.590090 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:45Z","lastTransitionTime":"2025-09-30T19:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.693015 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.693082 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.693093 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.693115 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.693126 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:45Z","lastTransitionTime":"2025-09-30T19:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.795869 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.795994 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.796017 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.796072 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.796099 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:45Z","lastTransitionTime":"2025-09-30T19:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.899431 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.899480 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.899491 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.899509 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:45 crc kubenswrapper[4553]: I0930 19:33:45.899522 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:45Z","lastTransitionTime":"2025-09-30T19:33:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.002017 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.002069 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.002078 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.002095 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.002105 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:46Z","lastTransitionTime":"2025-09-30T19:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.105690 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.105761 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.105781 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.105809 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.105830 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:46Z","lastTransitionTime":"2025-09-30T19:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.209749 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.209821 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.209844 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.209877 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.209899 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:46Z","lastTransitionTime":"2025-09-30T19:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.313665 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.313744 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.313764 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.313790 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.313809 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:46Z","lastTransitionTime":"2025-09-30T19:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.417971 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.418078 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.418103 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.418138 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.418156 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:46Z","lastTransitionTime":"2025-09-30T19:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.521722 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.521807 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.521827 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.521858 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.521878 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:46Z","lastTransitionTime":"2025-09-30T19:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.625650 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.625715 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.625734 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.625758 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.625775 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:46Z","lastTransitionTime":"2025-09-30T19:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.729610 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.729674 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.729691 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.729719 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.729738 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:46Z","lastTransitionTime":"2025-09-30T19:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.833668 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.833749 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.833773 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.833804 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.833832 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:46Z","lastTransitionTime":"2025-09-30T19:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.937231 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.937632 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.937641 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.937656 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:46 crc kubenswrapper[4553]: I0930 19:33:46.937665 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:46Z","lastTransitionTime":"2025-09-30T19:33:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.040250 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.040308 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.040322 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.040345 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.040360 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:47Z","lastTransitionTime":"2025-09-30T19:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.143857 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.143957 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.143976 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.144033 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.144107 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:47Z","lastTransitionTime":"2025-09-30T19:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.247826 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.247973 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.247990 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.248008 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.248021 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:47Z","lastTransitionTime":"2025-09-30T19:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.352512 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.352596 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.352620 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.352652 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.352681 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:47Z","lastTransitionTime":"2025-09-30T19:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.456559 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.456618 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.456656 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.456696 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.456721 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:47Z","lastTransitionTime":"2025-09-30T19:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.504078 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:47 crc kubenswrapper[4553]: E0930 19:33:47.504325 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.504716 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:47 crc kubenswrapper[4553]: E0930 19:33:47.504854 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.505201 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:47 crc kubenswrapper[4553]: E0930 19:33:47.505322 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.505722 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:47 crc kubenswrapper[4553]: E0930 19:33:47.505862 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.543530 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"309a705c-be47-41a5-b9d1-887b86bbcb84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38442f174e887f4ff469c95cfe6f2c1a7065a72f4873d1f3abd3a7c0536742aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65c375917e1325ddbcdf68e4e9e3ac26a5ab4831ae795b512dc109358c1f9061\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65f67281e9e1826d6458a1a6ddb8dca7274ab0e3a7874b80330aca25aaf44ab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5205f9f81b460c81e4830a12fdbbf33883aff05771306c22b80e50e552c11f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffe20d2e2dfd5cd1af4e2dbe87129b87f68cabee03c68058961f664f19d44cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44d380221e1b741561ad636fa38e034df3f1168f68b64e8e8e4dc5ecabadfeed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09db1eef97360b99063f5be9033db2d6365180547252fa19c0810aca89fd0a4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ae87fe02ed55da4968c19b9564c205aa0365a24e7d83eda175e568bdbe1d75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2025-09-30T19:32:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.559973 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.560029 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.560079 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.560107 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.560125 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:47Z","lastTransitionTime":"2025-09-30T19:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.572914 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88e51d0b-505c-4388-a989-44aabe0ca78d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3b8b4686cfd3b378f4d900db9747a5e1b978ade133d14ea4294154e88239a73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d867c87a52709bfe2a0b87cfdd8473916524055974869571c5e66454c56d5319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df033e939741e02e5074b01c481afd36c8349cd79c429dc9cc612395cd8e68e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6ca9854c3b2e3cae63b2abbc6c50df275754ea6afa3ecfe6d15a878f09b9454\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0922353015f02610ca6664445ac381af72bb90e57d2a7a266ff89abc2f51082\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 19:32:47.116194 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 19:32:47.116394 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 19:32:47.118119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3311956634/tls.crt::/tmp/serving-cert-3311956634/tls.key\\\\\\\"\\\\nI0930 19:32:47.454883 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 19:32:47.460236 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 19:32:47.460266 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 19:32:47.460295 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 19:32:47.460302 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 19:32:47.469344 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 19:32:47.469370 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469375 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 19:32:47.469381 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 19:32:47.469384 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0930 19:32:47.469387 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 19:32:47.469390 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 19:32:47.469409 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 19:32:47.471330 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:41Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1347dcfc547f4e7b9285c3d0a3b467a600fb18e5e15762dbcaf37e79b1cb7235\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04057d826ccef150b84357f466961b56df84dfa879816bc3cb9ba69f8b5408ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.591507 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p4qgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff92820c-07f5-4503-8d99-5428f5fbecb8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e7092e0f05b04858be03d1cd2037bfb1729b9518957cbbba28019e9769be229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qgqgj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p4qgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.614429 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4457466e-c6fd-4a2f-8b73-c205c50f90e3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:42Z\\\",\\\"message\\\":\\\"l\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 19:33:42.671228 6466 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager-operator/metrics\\\\\\\"}\\\\nI0930 19:33:42.671933 6466 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-controller-manager-operator for network=default : 2.21187ms\\\\nI0930 19:33:42.671968 6466 services_controller.go:356] Processing sync for service openshift-kube-storage-version-migrator-operator/metrics for network=default\\\\nF0930 19:33:42.672345 6466 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:33:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3db381b6a40962eee
e2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz6qz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fmsrf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.635543 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bf65d04-1873-4650-868c-076118fd4dd5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae915f74f875c4b4aae052da19bcce9693322ed42573211a3fd458761891b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fb42a36284069544112bc5523c6c89b0d0cae4b3cfd7bb292a05691e1de01cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://196c8c016a4602b9d6bda11b4c30276c3536485dc8c75529c0c7059816768d25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ba26c047b943df107ca78816932da5897d45635cd55d80be668dde33e3ce2c2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.655766 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.663574 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.663650 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.663670 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 
19:33:47.663701 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.663720 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:47Z","lastTransitionTime":"2025-09-30T19:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.673450 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e817c67-7688-42d4-8a82-ce72282cbb51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2bc0407cf9068339746db9fc480a422b218894086c0e8bf84ffe91b855fefcfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lhn66\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-9n4dl\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.694236 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.710620 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-46cs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1baa362f-a5ec-4459-9108-66da9e4195de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d1694c12847ee9e1b9defc763e99a59319ee1110f5f54cdc96f83e5f9413c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:50Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-46cs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.732652 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff8d817f-2918-40e6-92da-79585629fb4c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ae6eef7edd538ebef5374a4feae9a8889fb7d671191b3cc3d86f083fc60b80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0fd673fbde795ab7edca5fcb2a0cbf9115639b912f914075014349f1649132d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fd673fbde795ab7edca5fcb2a0cbf9115639b912f914075014349f1649132d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc 
kubenswrapper[4553]: I0930 19:33:47.757999 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6a9221e6b245e1a3fe2af879e85d108b499482c19d40bc9e93d5669bbee160\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f4b24a776590efb982a4bafcabf6ac3b7c933ce36f6d2ecaef2bddc97d0f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.769196 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.769550 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.769676 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.769788 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 
19:33:47.769898 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:47Z","lastTransitionTime":"2025-09-30T19:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.776223 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-swqk9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"584c5bac-180e-46de-8e53-6586f27f2cea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn7xc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-swqk9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc 
kubenswrapper[4553]: I0930 19:33:47.802676 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b7b8059-b38b-4faf-8a46-ad5a8489cf21\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5f4c98a73ac89f5c3090fdfdcbdf11510f24052696527a035cba02970603aeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://242405f72431cefc2968a455ee41ad08d2334ccfe30c3e694a904d9848e40244\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd26ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://046e6d30d737e40f8a8aaa9754dd2
6ca484891aac7ffa61fbe4c0e7c99ca7b66\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc343784f8d9b18d612b22f306cfb9defdac905dae457e775d9511fc1d6ffe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b5508b17529def0463f509e8b75a7d0c5f20adddcf2d09961ac65d41a65decc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://31f81024da7797e8f08567e4ca44cd83692a3a65aa9e32817ac593db902a3ed0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c44046c3b095bcaf615c96d94b68ea9a802aeb726d4edd46f7d65a66407ce7f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T19:32:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kltgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qwr6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.825549 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eeadb89d-f9b0-42ae-968a-7247e022858a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5367fccd33992460324676e8118b4d00686f6c29d175c684a03b95590543528\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c70e1b39ff079d93f39dcda5450bd444cb4ed9cc40841966f557bb1baa92c03e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc89d2d8a288cb967c6f7d0b01ac45d75e0ed7774a4489a07f3f4ed00db323b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca2182ad76e535173473a6a010378b0fc1dda15fafa53e6c588654e674baccab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:27Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.851659 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b678c605c31a646ca80647e9b4268e3bd72adbe5882fd21397897134c83bdbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.869198 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5808e944ebf1cf8d852a59f6cb9dc06fe463d6f12b2ba1c64d82d10378ea840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:32:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.873868 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.873952 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.873994 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.874015 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.874031 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:47Z","lastTransitionTime":"2025-09-30T19:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.885715 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.907267 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vzlwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d6b9396-3666-49a3-9d06-f764a3b39edf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:32:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6dc41b9c827c96a6cf2567e40b6c09a48358331418ee9f753187a7381186e93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T19:33:35Z\\\",\\\"message\\\":\\\"2025-09-30T19:32:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b\\\\n2025-09-30T19:32:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ef45469-9d75-4426-af8c-84b87f830e7b to /host/opt/cni/bin/\\\\n2025-09-30T19:32:50Z [verbose] multus-daemon started\\\\n2025-09-30T19:32:50Z [verbose] 
Readiness Indicator file check\\\\n2025-09-30T19:33:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T19:32:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpc46\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:32:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vzlwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.924885 4553 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884f4a42-261c-4547-95da-20ba542ce60b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed791878cd40c8c23f9396cbcee6de466f1c23e4e9c8aa7d6025c3cfa0a8acd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3119591bff0fd9bdaaf6e2987f29223dab8dd
6403fbce24eabb30e610ecaedcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T19:33:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggkbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T19:33:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5szqp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:47Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.977880 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.977948 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.977962 4553 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.977985 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:47 crc kubenswrapper[4553]: I0930 19:33:47.978001 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:47Z","lastTransitionTime":"2025-09-30T19:33:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.081182 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.081250 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.081272 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.081300 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.081321 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:48Z","lastTransitionTime":"2025-09-30T19:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.184098 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.184175 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.184194 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.184228 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.184248 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:48Z","lastTransitionTime":"2025-09-30T19:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.287581 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.287644 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.287661 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.287685 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.287702 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:48Z","lastTransitionTime":"2025-09-30T19:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.390923 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.392112 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.392146 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.392173 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.392190 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:48Z","lastTransitionTime":"2025-09-30T19:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.495353 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.495428 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.495449 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.495475 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.495494 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:48Z","lastTransitionTime":"2025-09-30T19:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.597911 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.598785 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.598998 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.599271 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.599429 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:48Z","lastTransitionTime":"2025-09-30T19:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.696419 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.696504 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.696531 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.696570 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.696595 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:48Z","lastTransitionTime":"2025-09-30T19:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:48 crc kubenswrapper[4553]: E0930 19:33:48.724216 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.732331 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.732401 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.732425 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.732458 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.732482 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:48Z","lastTransitionTime":"2025-09-30T19:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:48 crc kubenswrapper[4553]: E0930 19:33:48.753656 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.760555 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.760602 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.760616 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.760635 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.760648 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:48Z","lastTransitionTime":"2025-09-30T19:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.789476 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.789515 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.789524 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.789539 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.789550 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:48Z","lastTransitionTime":"2025-09-30T19:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.816184 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.816272 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.816298 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.816328 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.816347 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:48Z","lastTransitionTime":"2025-09-30T19:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:48 crc kubenswrapper[4553]: E0930 19:33:48.838776 4553 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148060Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608860Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T19:33:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"825ea34c-fb99-4283-90cd-f6aa86e2aea9\\\",\\\"systemUUID\\\":\\\"f7f0cc3b-9e9e-4494-aa54-d7d914be9c4b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T19:33:48Z is after 2025-08-24T17:21:41Z" Sep 30 19:33:48 crc kubenswrapper[4553]: E0930 19:33:48.839005 4553 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.841220 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.841283 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.841309 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.841346 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.841370 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:48Z","lastTransitionTime":"2025-09-30T19:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.945435 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.945507 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.945532 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.945561 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:48 crc kubenswrapper[4553]: I0930 19:33:48.945581 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:48Z","lastTransitionTime":"2025-09-30T19:33:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.049033 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.049143 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.049177 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.049210 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.049232 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:49Z","lastTransitionTime":"2025-09-30T19:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.153239 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.153307 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.153331 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.153386 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.153413 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:49Z","lastTransitionTime":"2025-09-30T19:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.263966 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.264072 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.264097 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.264126 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.264147 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:49Z","lastTransitionTime":"2025-09-30T19:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.374630 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.374700 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.374717 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.374751 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.374769 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:49Z","lastTransitionTime":"2025-09-30T19:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.477954 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.478023 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.478066 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.478094 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.478113 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:49Z","lastTransitionTime":"2025-09-30T19:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.503497 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.503519 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.503581 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:49 crc kubenswrapper[4553]: E0930 19:33:49.503841 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.503963 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:49 crc kubenswrapper[4553]: E0930 19:33:49.504159 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:49 crc kubenswrapper[4553]: E0930 19:33:49.504288 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:49 crc kubenswrapper[4553]: E0930 19:33:49.504392 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.582617 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.582678 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.582696 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.582720 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.582739 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:49Z","lastTransitionTime":"2025-09-30T19:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.685654 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.685711 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.685728 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.685750 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.685767 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:49Z","lastTransitionTime":"2025-09-30T19:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.788472 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.788555 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.788578 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.788612 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.788636 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:49Z","lastTransitionTime":"2025-09-30T19:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.891670 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.891721 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.891736 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.891762 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.891779 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:49Z","lastTransitionTime":"2025-09-30T19:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.995177 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.995252 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.995277 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.995307 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:49 crc kubenswrapper[4553]: I0930 19:33:49.995327 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:49Z","lastTransitionTime":"2025-09-30T19:33:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.098381 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.098462 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.098476 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.098495 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.098510 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:50Z","lastTransitionTime":"2025-09-30T19:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.201738 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.201808 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.201861 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.201888 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.201906 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:50Z","lastTransitionTime":"2025-09-30T19:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.305103 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.305227 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.305254 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.305285 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.305306 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:50Z","lastTransitionTime":"2025-09-30T19:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.409268 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.409338 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.409361 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.409393 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.409415 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:50Z","lastTransitionTime":"2025-09-30T19:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.517290 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.517366 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.517383 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.517411 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.517428 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:50Z","lastTransitionTime":"2025-09-30T19:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.620245 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.620375 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.620403 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.620435 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.620460 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:50Z","lastTransitionTime":"2025-09-30T19:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.724122 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.724818 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.724859 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.724890 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.724907 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:50Z","lastTransitionTime":"2025-09-30T19:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.829246 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.829325 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.829344 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.829369 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.829388 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:50Z","lastTransitionTime":"2025-09-30T19:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.933342 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.933422 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.933442 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.933503 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:50 crc kubenswrapper[4553]: I0930 19:33:50.933525 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:50Z","lastTransitionTime":"2025-09-30T19:33:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.036853 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.036928 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.036953 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.036983 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.037005 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:51Z","lastTransitionTime":"2025-09-30T19:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.140961 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.141018 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.141033 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.141079 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.141092 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:51Z","lastTransitionTime":"2025-09-30T19:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.229800 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.229915 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.229947 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:55.229923935 +0000 UTC m=+148.429426075 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.230000 4553 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.230015 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.230060 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:34:55.230033268 +0000 UTC m=+148.429535398 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.230127 4553 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.230173 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 19:34:55.230163031 +0000 UTC m=+148.429665161 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.244471 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.244500 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.244511 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.244531 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.244544 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:51Z","lastTransitionTime":"2025-09-30T19:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.331476 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.331594 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.331835 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.331839 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.331912 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.331935 4553 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.331865 4553 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.332014 4553 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.332023 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 19:34:55.331996173 +0000 UTC m=+148.531498323 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.332168 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 19:34:55.332139297 +0000 UTC m=+148.531641617 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.347772 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.347870 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.347887 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.347907 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.347947 4553 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:51Z","lastTransitionTime":"2025-09-30T19:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.450867 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.451321 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.451338 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.451361 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.451371 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:51Z","lastTransitionTime":"2025-09-30T19:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.503519 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.503633 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.503684 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.503704 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.503851 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.503909 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.504277 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.504449 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.533539 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.533719 4553 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:33:51 crc kubenswrapper[4553]: E0930 19:33:51.533785 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs podName:584c5bac-180e-46de-8e53-6586f27f2cea nodeName:}" failed. No retries permitted until 2025-09-30 19:34:55.533765325 +0000 UTC m=+148.733267455 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs") pod "network-metrics-daemon-swqk9" (UID: "584c5bac-180e-46de-8e53-6586f27f2cea") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.554121 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.554179 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.554192 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.554210 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.554224 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:51Z","lastTransitionTime":"2025-09-30T19:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.656660 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.656735 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.656758 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.656790 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.656814 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:51Z","lastTransitionTime":"2025-09-30T19:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.765324 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.765369 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.765380 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.765395 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.765406 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:51Z","lastTransitionTime":"2025-09-30T19:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.869073 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.869117 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.869128 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.869143 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.869152 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:51Z","lastTransitionTime":"2025-09-30T19:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.973142 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.973221 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.973244 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.973271 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:51 crc kubenswrapper[4553]: I0930 19:33:51.973290 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:51Z","lastTransitionTime":"2025-09-30T19:33:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.077199 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.077270 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.077293 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.077324 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.077346 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:52Z","lastTransitionTime":"2025-09-30T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.180679 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.180730 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.180790 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.180815 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.180832 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:52Z","lastTransitionTime":"2025-09-30T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.284671 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.284735 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.284757 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.284785 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.284807 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:52Z","lastTransitionTime":"2025-09-30T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.388648 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.388711 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.388723 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.388743 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.388752 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:52Z","lastTransitionTime":"2025-09-30T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.493818 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.493880 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.493891 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.493908 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.493919 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:52Z","lastTransitionTime":"2025-09-30T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.596852 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.596889 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.596898 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.596914 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.596926 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:52Z","lastTransitionTime":"2025-09-30T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.700451 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.700523 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.700544 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.700567 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.700588 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:52Z","lastTransitionTime":"2025-09-30T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.803698 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.803782 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.803808 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.803839 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.803861 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:52Z","lastTransitionTime":"2025-09-30T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.906912 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.906944 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.906955 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.906976 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:52 crc kubenswrapper[4553]: I0930 19:33:52.906991 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:52Z","lastTransitionTime":"2025-09-30T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.009850 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.009904 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.009914 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.009927 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.009953 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:53Z","lastTransitionTime":"2025-09-30T19:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.113084 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.113150 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.113165 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.113229 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.113249 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:53Z","lastTransitionTime":"2025-09-30T19:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.217287 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.217343 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.217356 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.217379 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.217392 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:53Z","lastTransitionTime":"2025-09-30T19:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.322521 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.322611 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.322644 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.322677 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.322699 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:53Z","lastTransitionTime":"2025-09-30T19:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.426127 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.426219 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.426238 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.426262 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.426281 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:53Z","lastTransitionTime":"2025-09-30T19:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.503983 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.504062 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.504085 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:53 crc kubenswrapper[4553]: E0930 19:33:53.504184 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.504316 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:53 crc kubenswrapper[4553]: E0930 19:33:53.504345 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:53 crc kubenswrapper[4553]: E0930 19:33:53.504567 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:53 crc kubenswrapper[4553]: E0930 19:33:53.504711 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.528808 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.528873 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.528887 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.528910 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.528926 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:53Z","lastTransitionTime":"2025-09-30T19:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.631519 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.631573 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.631594 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.631619 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.631639 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:53Z","lastTransitionTime":"2025-09-30T19:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.734595 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.734671 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.734693 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.734724 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.734746 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:53Z","lastTransitionTime":"2025-09-30T19:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.839197 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.839277 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.839293 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.839343 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.839363 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:53Z","lastTransitionTime":"2025-09-30T19:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.945087 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.945139 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.945156 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.945179 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:53 crc kubenswrapper[4553]: I0930 19:33:53.945196 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:53Z","lastTransitionTime":"2025-09-30T19:33:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.048580 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.048664 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.048691 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.048721 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.048746 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:54Z","lastTransitionTime":"2025-09-30T19:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.151487 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.151562 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.151581 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.151606 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.151623 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:54Z","lastTransitionTime":"2025-09-30T19:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.255677 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.255720 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.255736 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.255757 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.255770 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:54Z","lastTransitionTime":"2025-09-30T19:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.359625 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.359741 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.359759 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.359782 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.359799 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:54Z","lastTransitionTime":"2025-09-30T19:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.463133 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.463201 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.463221 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.463258 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.463281 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:54Z","lastTransitionTime":"2025-09-30T19:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.567362 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.567423 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.567442 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.567464 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.567482 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:54Z","lastTransitionTime":"2025-09-30T19:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.677838 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.677899 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.677921 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.677949 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.677972 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:54Z","lastTransitionTime":"2025-09-30T19:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.783392 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.783464 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.783486 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.783516 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.783538 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:54Z","lastTransitionTime":"2025-09-30T19:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.887801 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.887929 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.887996 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.888030 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.888133 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:54Z","lastTransitionTime":"2025-09-30T19:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.990707 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.990754 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.990767 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.990782 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:54 crc kubenswrapper[4553]: I0930 19:33:54.990795 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:54Z","lastTransitionTime":"2025-09-30T19:33:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.094013 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.094095 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.094111 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.094131 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.094146 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:55Z","lastTransitionTime":"2025-09-30T19:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.197487 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.197543 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.197551 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.197564 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.197573 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:55Z","lastTransitionTime":"2025-09-30T19:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.301113 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.301177 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.301190 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.301208 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.301218 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:55Z","lastTransitionTime":"2025-09-30T19:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.404283 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.404321 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.404334 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.404352 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.404364 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:55Z","lastTransitionTime":"2025-09-30T19:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.503739 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.503739 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.503807 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.503879 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:55 crc kubenswrapper[4553]: E0930 19:33:55.504102 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:55 crc kubenswrapper[4553]: E0930 19:33:55.504239 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:55 crc kubenswrapper[4553]: E0930 19:33:55.504343 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:55 crc kubenswrapper[4553]: E0930 19:33:55.505122 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.505294 4553 scope.go:117] "RemoveContainer" containerID="58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c" Sep 30 19:33:55 crc kubenswrapper[4553]: E0930 19:33:55.505472 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.507169 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.507199 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.507209 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.507225 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.507237 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:55Z","lastTransitionTime":"2025-09-30T19:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.610943 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.611006 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.611024 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.611077 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.611098 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:55Z","lastTransitionTime":"2025-09-30T19:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.714529 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.714594 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.714611 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.714637 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.714656 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:55Z","lastTransitionTime":"2025-09-30T19:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.817802 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.817859 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.817877 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.817901 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.817919 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:55Z","lastTransitionTime":"2025-09-30T19:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.921528 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.921621 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.921642 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.921673 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:55 crc kubenswrapper[4553]: I0930 19:33:55.921699 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:55Z","lastTransitionTime":"2025-09-30T19:33:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.025258 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.025338 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.025361 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.025393 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.025418 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:56Z","lastTransitionTime":"2025-09-30T19:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.128391 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.128461 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.128481 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.128510 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.128529 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:56Z","lastTransitionTime":"2025-09-30T19:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.231544 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.231584 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.231596 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.231616 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.231629 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:56Z","lastTransitionTime":"2025-09-30T19:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.335599 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.335685 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.335712 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.335751 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.335777 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:56Z","lastTransitionTime":"2025-09-30T19:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.438811 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.438898 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.438913 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.438938 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.438954 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:56Z","lastTransitionTime":"2025-09-30T19:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.541643 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.541714 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.541733 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.541763 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.541781 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:56Z","lastTransitionTime":"2025-09-30T19:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.644446 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.644493 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.644502 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.644514 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.644523 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:56Z","lastTransitionTime":"2025-09-30T19:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.747793 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.747959 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.747989 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.748016 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.748072 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:56Z","lastTransitionTime":"2025-09-30T19:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.851356 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.851418 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.851435 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.851457 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.851475 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:56Z","lastTransitionTime":"2025-09-30T19:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.955971 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.956096 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.956142 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.956178 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:56 crc kubenswrapper[4553]: I0930 19:33:56.956199 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:56Z","lastTransitionTime":"2025-09-30T19:33:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.058636 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.058682 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.058699 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.058724 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.058741 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:57Z","lastTransitionTime":"2025-09-30T19:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.161663 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.161713 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.161726 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.161743 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.161754 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:57Z","lastTransitionTime":"2025-09-30T19:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.265151 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.265206 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.265224 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.265248 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.265265 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:57Z","lastTransitionTime":"2025-09-30T19:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.368844 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.368893 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.368911 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.368934 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.368951 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:57Z","lastTransitionTime":"2025-09-30T19:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.472763 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.472839 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.472848 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.472864 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.472877 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:57Z","lastTransitionTime":"2025-09-30T19:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.503613 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:57 crc kubenswrapper[4553]: E0930 19:33:57.503980 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.504001 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.504073 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.504030 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:57 crc kubenswrapper[4553]: E0930 19:33:57.504156 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:57 crc kubenswrapper[4553]: E0930 19:33:57.504310 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:57 crc kubenswrapper[4553]: E0930 19:33:57.504439 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.554277 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=70.554251064 podStartE2EDuration="1m10.554251064s" podCreationTimestamp="2025-09-30 19:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:33:57.552142016 +0000 UTC m=+90.751644236" watchObservedRunningTime="2025-09-30 19:33:57.554251064 +0000 UTC m=+90.753753194" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.577347 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.577419 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.577446 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.577480 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.577505 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:57Z","lastTransitionTime":"2025-09-30T19:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.642904 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p4qgs" podStartSLOduration=70.642884059 podStartE2EDuration="1m10.642884059s" podCreationTimestamp="2025-09-30 19:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:33:57.598120979 +0000 UTC m=+90.797623139" watchObservedRunningTime="2025-09-30 19:33:57.642884059 +0000 UTC m=+90.842386189" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.680407 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.680445 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.680458 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.680473 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.680484 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:57Z","lastTransitionTime":"2025-09-30T19:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.685689 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.685673315 podStartE2EDuration="1m8.685673315s" podCreationTimestamp="2025-09-30 19:32:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:33:57.671255775 +0000 UTC m=+90.870757925" watchObservedRunningTime="2025-09-30 19:33:57.685673315 +0000 UTC m=+90.885175445" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.711751 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podStartSLOduration=70.711713958 podStartE2EDuration="1m10.711713958s" podCreationTimestamp="2025-09-30 19:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:33:57.698209944 +0000 UTC m=+90.897712074" watchObservedRunningTime="2025-09-30 19:33:57.711713958 +0000 UTC m=+90.911216078" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.722020 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-46cs9" podStartSLOduration=69.722001627 podStartE2EDuration="1m9.722001627s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:33:57.721482972 +0000 UTC m=+90.920985102" watchObservedRunningTime="2025-09-30 19:33:57.722001627 +0000 UTC m=+90.921503757" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.747384 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.747363872 
podStartE2EDuration="42.747363872s" podCreationTimestamp="2025-09-30 19:33:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:33:57.733521508 +0000 UTC m=+90.933023658" watchObservedRunningTime="2025-09-30 19:33:57.747363872 +0000 UTC m=+90.946866002" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.783775 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.783810 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.783821 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.783836 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.783847 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:57Z","lastTransitionTime":"2025-09-30T19:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.813374 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qwr6w" podStartSLOduration=70.813350625 podStartE2EDuration="1m10.813350625s" podCreationTimestamp="2025-09-30 19:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:33:57.792020199 +0000 UTC m=+90.991522349" watchObservedRunningTime="2025-09-30 19:33:57.813350625 +0000 UTC m=+91.012852765" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.840546 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=20.840522109 podStartE2EDuration="20.840522109s" podCreationTimestamp="2025-09-30 19:33:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:33:57.814124065 +0000 UTC m=+91.013626195" watchObservedRunningTime="2025-09-30 19:33:57.840522109 +0000 UTC m=+91.040024239" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.885673 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.885706 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.885714 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.885727 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.885737 4553 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:57Z","lastTransitionTime":"2025-09-30T19:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.895770 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vzlwd" podStartSLOduration=70.895750262 podStartE2EDuration="1m10.895750262s" podCreationTimestamp="2025-09-30 19:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:33:57.894411625 +0000 UTC m=+91.093913765" watchObservedRunningTime="2025-09-30 19:33:57.895750262 +0000 UTC m=+91.095252392" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.935838 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5szqp" podStartSLOduration=69.935814174 podStartE2EDuration="1m9.935814174s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:33:57.912269167 +0000 UTC m=+91.111771297" watchObservedRunningTime="2025-09-30 19:33:57.935814174 +0000 UTC m=+91.135316314" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.936703 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.936696618 podStartE2EDuration="1m7.936696618s" podCreationTimestamp="2025-09-30 19:32:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 19:33:57.935603138 +0000 UTC m=+91.135105268" watchObservedRunningTime="2025-09-30 19:33:57.936696618 +0000 UTC m=+91.136198748" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.989214 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.989254 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.989264 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.989278 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:57 crc kubenswrapper[4553]: I0930 19:33:57.989289 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:57Z","lastTransitionTime":"2025-09-30T19:33:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.091361 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.091410 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.091424 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.091444 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.091456 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:58Z","lastTransitionTime":"2025-09-30T19:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.194490 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.194528 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.194537 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.194552 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.194563 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:58Z","lastTransitionTime":"2025-09-30T19:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.296946 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.297002 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.297019 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.297069 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.297087 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:58Z","lastTransitionTime":"2025-09-30T19:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.403943 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.404001 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.404015 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.404056 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.404073 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:58Z","lastTransitionTime":"2025-09-30T19:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.506606 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.506674 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.506690 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.506711 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.506727 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:58Z","lastTransitionTime":"2025-09-30T19:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.609905 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.609960 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.609970 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.609988 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.610001 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:58Z","lastTransitionTime":"2025-09-30T19:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.713520 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.713598 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.713611 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.713651 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.713666 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:58Z","lastTransitionTime":"2025-09-30T19:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.817371 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.817437 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.817448 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.817466 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.817477 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:58Z","lastTransitionTime":"2025-09-30T19:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.848344 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.848427 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.848454 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.848486 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.848509 4553 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T19:33:58Z","lastTransitionTime":"2025-09-30T19:33:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.915544 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr"] Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.916233 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.919318 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.919705 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.919736 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 30 19:33:58 crc kubenswrapper[4553]: I0930 19:33:58.921849 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.021621 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/de5ddfe3-a57a-492f-adc5-46ac07352ce4-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.021932 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/de5ddfe3-a57a-492f-adc5-46ac07352ce4-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.022068 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/de5ddfe3-a57a-492f-adc5-46ac07352ce4-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.022217 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de5ddfe3-a57a-492f-adc5-46ac07352ce4-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.022291 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de5ddfe3-a57a-492f-adc5-46ac07352ce4-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.123889 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/de5ddfe3-a57a-492f-adc5-46ac07352ce4-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.124011 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/de5ddfe3-a57a-492f-adc5-46ac07352ce4-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.124121 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de5ddfe3-a57a-492f-adc5-46ac07352ce4-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.124155 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de5ddfe3-a57a-492f-adc5-46ac07352ce4-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.124196 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de5ddfe3-a57a-492f-adc5-46ac07352ce4-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.124203 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/de5ddfe3-a57a-492f-adc5-46ac07352ce4-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.124440 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/de5ddfe3-a57a-492f-adc5-46ac07352ce4-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.125995 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de5ddfe3-a57a-492f-adc5-46ac07352ce4-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.137498 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de5ddfe3-a57a-492f-adc5-46ac07352ce4-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.162234 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de5ddfe3-a57a-492f-adc5-46ac07352ce4-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bz7wr\" (UID: \"de5ddfe3-a57a-492f-adc5-46ac07352ce4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.241617 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" Sep 30 19:33:59 crc kubenswrapper[4553]: W0930 19:33:59.264181 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde5ddfe3_a57a_492f_adc5_46ac07352ce4.slice/crio-e1d0ab5141223412e7d9c920c61be454544c60e4148917a9695c90e4d918ea74 WatchSource:0}: Error finding container e1d0ab5141223412e7d9c920c61be454544c60e4148917a9695c90e4d918ea74: Status 404 returned error can't find the container with id e1d0ab5141223412e7d9c920c61be454544c60e4148917a9695c90e4d918ea74 Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.503165 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.503245 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.503244 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:33:59 crc kubenswrapper[4553]: E0930 19:33:59.503359 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:33:59 crc kubenswrapper[4553]: I0930 19:33:59.503396 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:33:59 crc kubenswrapper[4553]: E0930 19:33:59.503536 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:33:59 crc kubenswrapper[4553]: E0930 19:33:59.503709 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:33:59 crc kubenswrapper[4553]: E0930 19:33:59.503776 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:00 crc kubenswrapper[4553]: I0930 19:34:00.062718 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" event={"ID":"de5ddfe3-a57a-492f-adc5-46ac07352ce4","Type":"ContainerStarted","Data":"6a4cd90ab512a046f7023d8f1def95e57d12f6be513b36a8e4f7f82a71a3709e"} Sep 30 19:34:00 crc kubenswrapper[4553]: I0930 19:34:00.062843 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" event={"ID":"de5ddfe3-a57a-492f-adc5-46ac07352ce4","Type":"ContainerStarted","Data":"e1d0ab5141223412e7d9c920c61be454544c60e4148917a9695c90e4d918ea74"} Sep 30 19:34:00 crc kubenswrapper[4553]: I0930 19:34:00.089063 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bz7wr" podStartSLOduration=72.089018286 podStartE2EDuration="1m12.089018286s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:00.088295947 +0000 UTC m=+93.287798107" watchObservedRunningTime="2025-09-30 19:34:00.089018286 +0000 UTC m=+93.288520446" Sep 30 19:34:01 crc kubenswrapper[4553]: I0930 19:34:01.503953 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:01 crc kubenswrapper[4553]: I0930 19:34:01.504013 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:01 crc kubenswrapper[4553]: I0930 19:34:01.504171 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:01 crc kubenswrapper[4553]: E0930 19:34:01.504333 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:01 crc kubenswrapper[4553]: I0930 19:34:01.504647 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:01 crc kubenswrapper[4553]: E0930 19:34:01.504753 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:01 crc kubenswrapper[4553]: E0930 19:34:01.504964 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:01 crc kubenswrapper[4553]: E0930 19:34:01.505213 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:03 crc kubenswrapper[4553]: I0930 19:34:03.503764 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:03 crc kubenswrapper[4553]: E0930 19:34:03.504443 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:03 crc kubenswrapper[4553]: I0930 19:34:03.503847 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:03 crc kubenswrapper[4553]: E0930 19:34:03.504573 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:03 crc kubenswrapper[4553]: I0930 19:34:03.503936 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:03 crc kubenswrapper[4553]: E0930 19:34:03.504681 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:03 crc kubenswrapper[4553]: I0930 19:34:03.503785 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:03 crc kubenswrapper[4553]: E0930 19:34:03.504769 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:05 crc kubenswrapper[4553]: I0930 19:34:05.504332 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:05 crc kubenswrapper[4553]: I0930 19:34:05.504439 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:05 crc kubenswrapper[4553]: E0930 19:34:05.504485 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:05 crc kubenswrapper[4553]: I0930 19:34:05.504555 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:05 crc kubenswrapper[4553]: I0930 19:34:05.504721 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:05 crc kubenswrapper[4553]: E0930 19:34:05.504712 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:05 crc kubenswrapper[4553]: E0930 19:34:05.504846 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:05 crc kubenswrapper[4553]: E0930 19:34:05.504874 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:07 crc kubenswrapper[4553]: I0930 19:34:07.503856 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:07 crc kubenswrapper[4553]: I0930 19:34:07.504014 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:07 crc kubenswrapper[4553]: I0930 19:34:07.504129 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:07 crc kubenswrapper[4553]: E0930 19:34:07.504232 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:07 crc kubenswrapper[4553]: I0930 19:34:07.504330 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:07 crc kubenswrapper[4553]: E0930 19:34:07.505903 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:07 crc kubenswrapper[4553]: E0930 19:34:07.506007 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:07 crc kubenswrapper[4553]: E0930 19:34:07.506080 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:09 crc kubenswrapper[4553]: I0930 19:34:09.503435 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:09 crc kubenswrapper[4553]: I0930 19:34:09.503541 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:09 crc kubenswrapper[4553]: I0930 19:34:09.503657 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:09 crc kubenswrapper[4553]: E0930 19:34:09.503873 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:09 crc kubenswrapper[4553]: I0930 19:34:09.504387 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:09 crc kubenswrapper[4553]: E0930 19:34:09.504400 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:09 crc kubenswrapper[4553]: E0930 19:34:09.504709 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:09 crc kubenswrapper[4553]: E0930 19:34:09.504861 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:10 crc kubenswrapper[4553]: I0930 19:34:10.503851 4553 scope.go:117] "RemoveContainer" containerID="58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c" Sep 30 19:34:10 crc kubenswrapper[4553]: E0930 19:34:10.504055 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fmsrf_openshift-ovn-kubernetes(4457466e-c6fd-4a2f-8b73-c205c50f90e3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" Sep 30 19:34:11 crc kubenswrapper[4553]: I0930 19:34:11.503890 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:11 crc kubenswrapper[4553]: I0930 19:34:11.503890 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:11 crc kubenswrapper[4553]: I0930 19:34:11.503909 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:11 crc kubenswrapper[4553]: E0930 19:34:11.504101 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:11 crc kubenswrapper[4553]: E0930 19:34:11.504206 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:11 crc kubenswrapper[4553]: E0930 19:34:11.504474 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:11 crc kubenswrapper[4553]: I0930 19:34:11.504913 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:11 crc kubenswrapper[4553]: E0930 19:34:11.505124 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:13 crc kubenswrapper[4553]: I0930 19:34:13.503684 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:13 crc kubenswrapper[4553]: E0930 19:34:13.503835 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:13 crc kubenswrapper[4553]: I0930 19:34:13.504088 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:13 crc kubenswrapper[4553]: E0930 19:34:13.504160 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:13 crc kubenswrapper[4553]: I0930 19:34:13.504385 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:13 crc kubenswrapper[4553]: E0930 19:34:13.504453 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:13 crc kubenswrapper[4553]: I0930 19:34:13.505211 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:13 crc kubenswrapper[4553]: E0930 19:34:13.505427 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:15 crc kubenswrapper[4553]: I0930 19:34:15.504170 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:15 crc kubenswrapper[4553]: E0930 19:34:15.504342 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:15 crc kubenswrapper[4553]: I0930 19:34:15.504908 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:15 crc kubenswrapper[4553]: I0930 19:34:15.504992 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:15 crc kubenswrapper[4553]: I0930 19:34:15.505065 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:15 crc kubenswrapper[4553]: E0930 19:34:15.505254 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:15 crc kubenswrapper[4553]: E0930 19:34:15.505315 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:15 crc kubenswrapper[4553]: E0930 19:34:15.505363 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:17 crc kubenswrapper[4553]: I0930 19:34:17.504014 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:17 crc kubenswrapper[4553]: I0930 19:34:17.504181 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:17 crc kubenswrapper[4553]: E0930 19:34:17.505963 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:17 crc kubenswrapper[4553]: I0930 19:34:17.506013 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:17 crc kubenswrapper[4553]: I0930 19:34:17.506096 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:17 crc kubenswrapper[4553]: E0930 19:34:17.506287 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:17 crc kubenswrapper[4553]: E0930 19:34:17.506373 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:17 crc kubenswrapper[4553]: E0930 19:34:17.506467 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:19 crc kubenswrapper[4553]: I0930 19:34:19.503548 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:19 crc kubenswrapper[4553]: I0930 19:34:19.503698 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:19 crc kubenswrapper[4553]: E0930 19:34:19.503905 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:19 crc kubenswrapper[4553]: I0930 19:34:19.503942 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:19 crc kubenswrapper[4553]: I0930 19:34:19.503978 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:19 crc kubenswrapper[4553]: E0930 19:34:19.504168 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:19 crc kubenswrapper[4553]: E0930 19:34:19.504326 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:19 crc kubenswrapper[4553]: E0930 19:34:19.504599 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:21 crc kubenswrapper[4553]: I0930 19:34:21.504073 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:21 crc kubenswrapper[4553]: I0930 19:34:21.504033 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:21 crc kubenswrapper[4553]: E0930 19:34:21.504312 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:21 crc kubenswrapper[4553]: I0930 19:34:21.504346 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:21 crc kubenswrapper[4553]: E0930 19:34:21.504491 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:21 crc kubenswrapper[4553]: E0930 19:34:21.504622 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:21 crc kubenswrapper[4553]: I0930 19:34:21.504699 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:21 crc kubenswrapper[4553]: E0930 19:34:21.504846 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:22 crc kubenswrapper[4553]: I0930 19:34:22.150147 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vzlwd_0d6b9396-3666-49a3-9d06-f764a3b39edf/kube-multus/1.log" Sep 30 19:34:22 crc kubenswrapper[4553]: I0930 19:34:22.151359 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vzlwd_0d6b9396-3666-49a3-9d06-f764a3b39edf/kube-multus/0.log" Sep 30 19:34:22 crc kubenswrapper[4553]: I0930 19:34:22.151468 4553 generic.go:334] "Generic (PLEG): container finished" podID="0d6b9396-3666-49a3-9d06-f764a3b39edf" containerID="b6dc41b9c827c96a6cf2567e40b6c09a48358331418ee9f753187a7381186e93" exitCode=1 Sep 30 19:34:22 crc kubenswrapper[4553]: I0930 19:34:22.151529 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vzlwd" event={"ID":"0d6b9396-3666-49a3-9d06-f764a3b39edf","Type":"ContainerDied","Data":"b6dc41b9c827c96a6cf2567e40b6c09a48358331418ee9f753187a7381186e93"} Sep 30 19:34:22 crc kubenswrapper[4553]: I0930 19:34:22.152694 4553 scope.go:117] "RemoveContainer" containerID="f3001c6feeb55a51136693374dbc01cd82fd3c523cf19ea27debdd2ce2206e00" Sep 30 19:34:22 crc kubenswrapper[4553]: I0930 19:34:22.153448 4553 scope.go:117] "RemoveContainer" containerID="b6dc41b9c827c96a6cf2567e40b6c09a48358331418ee9f753187a7381186e93" Sep 30 19:34:22 crc kubenswrapper[4553]: E0930 19:34:22.160671 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-vzlwd_openshift-multus(0d6b9396-3666-49a3-9d06-f764a3b39edf)\"" pod="openshift-multus/multus-vzlwd" podUID="0d6b9396-3666-49a3-9d06-f764a3b39edf" Sep 30 19:34:23 crc kubenswrapper[4553]: I0930 19:34:23.157291 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-vzlwd_0d6b9396-3666-49a3-9d06-f764a3b39edf/kube-multus/1.log" Sep 30 19:34:23 crc kubenswrapper[4553]: I0930 19:34:23.503546 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:23 crc kubenswrapper[4553]: I0930 19:34:23.503635 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:23 crc kubenswrapper[4553]: I0930 19:34:23.503561 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:23 crc kubenswrapper[4553]: E0930 19:34:23.503713 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:23 crc kubenswrapper[4553]: E0930 19:34:23.503816 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:23 crc kubenswrapper[4553]: I0930 19:34:23.503840 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:23 crc kubenswrapper[4553]: E0930 19:34:23.503954 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:23 crc kubenswrapper[4553]: E0930 19:34:23.504132 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:24 crc kubenswrapper[4553]: I0930 19:34:24.504363 4553 scope.go:117] "RemoveContainer" containerID="58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c" Sep 30 19:34:25 crc kubenswrapper[4553]: I0930 19:34:25.165915 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/3.log" Sep 30 19:34:25 crc kubenswrapper[4553]: I0930 19:34:25.169327 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerStarted","Data":"1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148"} Sep 30 19:34:25 crc kubenswrapper[4553]: I0930 19:34:25.169847 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:34:25 crc kubenswrapper[4553]: 
I0930 19:34:25.200473 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podStartSLOduration=98.200445814 podStartE2EDuration="1m38.200445814s" podCreationTimestamp="2025-09-30 19:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:25.199851448 +0000 UTC m=+118.399353588" watchObservedRunningTime="2025-09-30 19:34:25.200445814 +0000 UTC m=+118.399947974" Sep 30 19:34:25 crc kubenswrapper[4553]: I0930 19:34:25.504102 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:25 crc kubenswrapper[4553]: E0930 19:34:25.504230 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:25 crc kubenswrapper[4553]: I0930 19:34:25.504268 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:25 crc kubenswrapper[4553]: I0930 19:34:25.504280 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:25 crc kubenswrapper[4553]: E0930 19:34:25.504417 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:25 crc kubenswrapper[4553]: I0930 19:34:25.504457 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:25 crc kubenswrapper[4553]: E0930 19:34:25.504530 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:25 crc kubenswrapper[4553]: E0930 19:34:25.504598 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:25 crc kubenswrapper[4553]: I0930 19:34:25.509256 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-swqk9"] Sep 30 19:34:26 crc kubenswrapper[4553]: I0930 19:34:26.172356 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:26 crc kubenswrapper[4553]: E0930 19:34:26.172533 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:27 crc kubenswrapper[4553]: I0930 19:34:27.504302 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:27 crc kubenswrapper[4553]: E0930 19:34:27.506269 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:27 crc kubenswrapper[4553]: I0930 19:34:27.506517 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:27 crc kubenswrapper[4553]: E0930 19:34:27.506831 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:27 crc kubenswrapper[4553]: I0930 19:34:27.507007 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:27 crc kubenswrapper[4553]: E0930 19:34:27.507297 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:27 crc kubenswrapper[4553]: E0930 19:34:27.521074 4553 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 30 19:34:27 crc kubenswrapper[4553]: E0930 19:34:27.599692 4553 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 19:34:28 crc kubenswrapper[4553]: I0930 19:34:28.503213 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:28 crc kubenswrapper[4553]: E0930 19:34:28.503901 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:29 crc kubenswrapper[4553]: I0930 19:34:29.503522 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:29 crc kubenswrapper[4553]: I0930 19:34:29.503546 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:29 crc kubenswrapper[4553]: I0930 19:34:29.503546 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:29 crc kubenswrapper[4553]: E0930 19:34:29.504470 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:29 crc kubenswrapper[4553]: E0930 19:34:29.504641 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:29 crc kubenswrapper[4553]: E0930 19:34:29.504929 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:30 crc kubenswrapper[4553]: I0930 19:34:30.503852 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:30 crc kubenswrapper[4553]: E0930 19:34:30.504086 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:31 crc kubenswrapper[4553]: I0930 19:34:31.503403 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:31 crc kubenswrapper[4553]: I0930 19:34:31.503462 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:31 crc kubenswrapper[4553]: I0930 19:34:31.503533 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:31 crc kubenswrapper[4553]: E0930 19:34:31.504325 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:31 crc kubenswrapper[4553]: E0930 19:34:31.504388 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:31 crc kubenswrapper[4553]: E0930 19:34:31.505388 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:32 crc kubenswrapper[4553]: I0930 19:34:32.503392 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:32 crc kubenswrapper[4553]: E0930 19:34:32.503584 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:32 crc kubenswrapper[4553]: E0930 19:34:32.601727 4553 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 19:34:33 crc kubenswrapper[4553]: I0930 19:34:33.503180 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:33 crc kubenswrapper[4553]: E0930 19:34:33.503400 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:33 crc kubenswrapper[4553]: I0930 19:34:33.503180 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:33 crc kubenswrapper[4553]: E0930 19:34:33.503757 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:33 crc kubenswrapper[4553]: I0930 19:34:33.503977 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:33 crc kubenswrapper[4553]: E0930 19:34:33.504280 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:34 crc kubenswrapper[4553]: I0930 19:34:34.503858 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:34 crc kubenswrapper[4553]: E0930 19:34:34.504931 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:34 crc kubenswrapper[4553]: I0930 19:34:34.504690 4553 scope.go:117] "RemoveContainer" containerID="b6dc41b9c827c96a6cf2567e40b6c09a48358331418ee9f753187a7381186e93" Sep 30 19:34:35 crc kubenswrapper[4553]: I0930 19:34:35.211686 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vzlwd_0d6b9396-3666-49a3-9d06-f764a3b39edf/kube-multus/1.log" Sep 30 19:34:35 crc kubenswrapper[4553]: I0930 19:34:35.212183 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vzlwd" event={"ID":"0d6b9396-3666-49a3-9d06-f764a3b39edf","Type":"ContainerStarted","Data":"81d6a88a8c1b8af5edd73d213278c902fce9950b02c0160d289373bf4061862a"} Sep 30 19:34:35 crc kubenswrapper[4553]: I0930 19:34:35.503198 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:35 crc kubenswrapper[4553]: I0930 19:34:35.503257 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:35 crc kubenswrapper[4553]: E0930 19:34:35.503380 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:35 crc kubenswrapper[4553]: I0930 19:34:35.503414 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:35 crc kubenswrapper[4553]: E0930 19:34:35.503564 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:35 crc kubenswrapper[4553]: E0930 19:34:35.503751 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:36 crc kubenswrapper[4553]: I0930 19:34:36.503470 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:36 crc kubenswrapper[4553]: E0930 19:34:36.503723 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swqk9" podUID="584c5bac-180e-46de-8e53-6586f27f2cea" Sep 30 19:34:37 crc kubenswrapper[4553]: I0930 19:34:37.503232 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:37 crc kubenswrapper[4553]: I0930 19:34:37.503304 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:37 crc kubenswrapper[4553]: I0930 19:34:37.503432 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:37 crc kubenswrapper[4553]: E0930 19:34:37.504129 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 19:34:37 crc kubenswrapper[4553]: E0930 19:34:37.504646 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 19:34:37 crc kubenswrapper[4553]: E0930 19:34:37.505328 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 19:34:38 crc kubenswrapper[4553]: I0930 19:34:38.503262 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:38 crc kubenswrapper[4553]: I0930 19:34:38.507939 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 30 19:34:38 crc kubenswrapper[4553]: I0930 19:34:38.508153 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.453865 4553 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.507162 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.507660 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.508147 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.511497 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.511593 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.511872 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.512326 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.520149 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-z7r6d"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.520774 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.522465 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-djhpv"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.523205 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.524210 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5dq4n"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.524823 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.525611 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.526257 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.527205 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.527234 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.528876 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.529304 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.529922 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.530405 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.534167 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.534301 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.534624 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.534906 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.535261 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.535438 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rgm8h"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.536096 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2chmh"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.536165 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.536296 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.536632 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.537136 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.537321 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.537471 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.538527 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.539004 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-6csmn"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.539452 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.539957 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.540168 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.541235 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.542303 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6bqsb"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.542879 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.546099 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-j2fv9"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.546631 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-j2fv9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.547367 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.548632 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.560380 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.562780 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.582055 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.582306 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.582316 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.582401 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.582587 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.582625 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.582790 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.582927 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.583356 4553 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.583481 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.583730 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.583940 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.584097 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.584203 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.586430 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.586694 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.586877 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.587089 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.587136 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 30 19:34:39 crc 
kubenswrapper[4553]: I0930 19:34:39.587191 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.589077 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rddkb"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.589565 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.591194 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.591459 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.604243 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.605062 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.605474 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.606418 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.607561 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.607611 4553 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.607986 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.608192 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.608346 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.608902 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.609935 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.610526 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.611830 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.620546 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.620612 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.620725 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 19:34:39 crc 
kubenswrapper[4553]: I0930 19:34:39.620849 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.620959 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.620989 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621065 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621135 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621210 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621278 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621301 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621370 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621409 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621426 4553 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621486 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621534 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621546 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621608 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621691 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.621870 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.622066 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.622080 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.622270 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.622137 4553 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.622242 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.622769 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6bpj6"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.622828 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.623051 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.623165 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.623242 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.623269 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.623335 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.623344 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.623170 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.623599 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.627132 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.629986 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.632995 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dpgkt"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.633196 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.633501 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-z7r6d"] Sep 30 19:34:39 crc 
kubenswrapper[4553]: I0930 19:34:39.633519 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-djhpv"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.633591 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-dpgkt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.635303 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.635454 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.635620 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.635864 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.636885 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.637976 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5dq4n"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.638865 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.639398 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.640391 4553 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.640516 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.645779 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.645994 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.647004 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.649813 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.650822 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.651247 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.651419 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653616 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653651 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/be6f13fb-81da-4176-8540-a3fa61cd7002-node-pullsecrets\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653673 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd13fb01-b6ec-486e-8b39-7440a349ae64-config\") pod \"machine-api-operator-5694c8668f-z7r6d\" (UID: \"fd13fb01-b6ec-486e-8b39-7440a349ae64\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653691 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653713 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-image-import-ca\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653730 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-config\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653745 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/323f9188-3789-4e7c-b4d2-17f051188a15-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rddkb\" (UID: \"323f9188-3789-4e7c-b4d2-17f051188a15\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653761 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-audit-dir\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653777 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crc7n\" (UniqueName: \"kubernetes.io/projected/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-kube-api-access-crc7n\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653792 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653806 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653823 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbvmf\" (UniqueName: \"kubernetes.io/projected/5c8d22af-7187-427b-8a5c-24e49c3e96cc-kube-api-access-vbvmf\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqs7\" (UID: \"5c8d22af-7187-427b-8a5c-24e49c3e96cc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653839 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-trusted-ca-bundle\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653853 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653867 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653882 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e38ed88a-7879-4394-b102-aa5ad331aa5e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7gg7n\" (UID: \"e38ed88a-7879-4394-b102-aa5ad331aa5e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653899 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e38ed88a-7879-4394-b102-aa5ad331aa5e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7gg7n\" (UID: \"e38ed88a-7879-4394-b102-aa5ad331aa5e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653914 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5w8z\" (UniqueName: \"kubernetes.io/projected/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-kube-api-access-b5w8z\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653926 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/be6f13fb-81da-4176-8540-a3fa61cd7002-etcd-client\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653942 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcm9b\" (UniqueName: \"kubernetes.io/projected/fd13fb01-b6ec-486e-8b39-7440a349ae64-kube-api-access-wcm9b\") pod \"machine-api-operator-5694c8668f-z7r6d\" (UID: \"fd13fb01-b6ec-486e-8b39-7440a349ae64\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653957 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a88973e4-6669-4d85-89b6-2d287df271ea-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4lrff\" (UID: \"a88973e4-6669-4d85-89b6-2d287df271ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653973 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-serving-cert\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.653987 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654000 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be6f13fb-81da-4176-8540-a3fa61cd7002-serving-cert\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654015 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e392aad-9ae5-4942-a078-8ef9cbaffb90-client-ca\") pod \"route-controller-manager-6576b87f9c-6tq2m\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654031 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl577\" (UniqueName: \"kubernetes.io/projected/76d4f83a-1f82-4374-bc4d-601f752d318d-kube-api-access-xl577\") pod \"downloads-7954f5f757-j2fv9\" (UID: \"76d4f83a-1f82-4374-bc4d-601f752d318d\") " pod="openshift-console/downloads-7954f5f757-j2fv9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654063 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-serving-cert\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654087 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-client-ca\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654103 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77393764-0c2d-4822-8558-71f98dbaef2f-config\") pod \"machine-approver-56656f9798-gf6g9\" (UID: \"77393764-0c2d-4822-8558-71f98dbaef2f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654118 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gbgc\" (UniqueName: \"kubernetes.io/projected/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-kube-api-access-6gbgc\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654131 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654144 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/be6f13fb-81da-4176-8540-a3fa61cd7002-encryption-config\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 
19:34:39.654161 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-serving-cert\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654179 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd13fb01-b6ec-486e-8b39-7440a349ae64-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-z7r6d\" (UID: \"fd13fb01-b6ec-486e-8b39-7440a349ae64\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654193 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654212 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8hps\" (UniqueName: \"kubernetes.io/projected/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-kube-api-access-d8hps\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654226 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654242 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e392aad-9ae5-4942-a078-8ef9cbaffb90-config\") pod \"route-controller-manager-6576b87f9c-6tq2m\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654272 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e38ed88a-7879-4394-b102-aa5ad331aa5e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7gg7n\" (UID: \"e38ed88a-7879-4394-b102-aa5ad331aa5e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654287 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a88973e4-6669-4d85-89b6-2d287df271ea-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4lrff\" (UID: \"a88973e4-6669-4d85-89b6-2d287df271ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654302 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgk6h\" (UniqueName: \"kubernetes.io/projected/a88973e4-6669-4d85-89b6-2d287df271ea-kube-api-access-bgk6h\") pod \"cluster-image-registry-operator-dc59b4c8b-4lrff\" (UID: 
\"a88973e4-6669-4d85-89b6-2d287df271ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654316 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/323f9188-3789-4e7c-b4d2-17f051188a15-serving-cert\") pod \"openshift-config-operator-7777fb866f-rddkb\" (UID: \"323f9188-3789-4e7c-b4d2-17f051188a15\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654330 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654345 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-etcd-client\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654358 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654372 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-encryption-config\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654385 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96bf498c-034c-431c-ae07-4099724a48a7-serving-cert\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654399 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654416 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-audit-dir\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654430 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-config\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc 
kubenswrapper[4553]: I0930 19:34:39.654443 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-config\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654458 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8d22af-7187-427b-8a5c-24e49c3e96cc-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqs7\" (UID: \"5c8d22af-7187-427b-8a5c-24e49c3e96cc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654472 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06f52f14-b54f-4666-9413-e299c6ad0f22-serving-cert\") pod \"console-operator-58897d9998-6bqsb\" (UID: \"06f52f14-b54f-4666-9413-e299c6ad0f22\") " pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654489 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqw8r\" (UniqueName: \"kubernetes.io/projected/96bf498c-034c-431c-ae07-4099724a48a7-kube-api-access-vqw8r\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654504 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsvhl\" (UniqueName: 
\"kubernetes.io/projected/77393764-0c2d-4822-8558-71f98dbaef2f-kube-api-access-nsvhl\") pod \"machine-approver-56656f9798-gf6g9\" (UID: \"77393764-0c2d-4822-8558-71f98dbaef2f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654517 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-config\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654531 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csg8v\" (UniqueName: \"kubernetes.io/projected/be6f13fb-81da-4176-8540-a3fa61cd7002-kube-api-access-csg8v\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654546 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fd13fb01-b6ec-486e-8b39-7440a349ae64-images\") pod \"machine-api-operator-5694c8668f-z7r6d\" (UID: \"fd13fb01-b6ec-486e-8b39-7440a349ae64\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654560 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77393764-0c2d-4822-8558-71f98dbaef2f-auth-proxy-config\") pod \"machine-approver-56656f9798-gf6g9\" (UID: \"77393764-0c2d-4822-8558-71f98dbaef2f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 
19:34:39.654576 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4c5k\" (UniqueName: \"kubernetes.io/projected/323f9188-3789-4e7c-b4d2-17f051188a15-kube-api-access-z4c5k\") pod \"openshift-config-operator-7777fb866f-rddkb\" (UID: \"323f9188-3789-4e7c-b4d2-17f051188a15\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654597 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbw75\" (UniqueName: \"kubernetes.io/projected/0e392aad-9ae5-4942-a078-8ef9cbaffb90-kube-api-access-rbw75\") pod \"route-controller-manager-6576b87f9c-6tq2m\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654612 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-service-ca-bundle\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654626 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06f52f14-b54f-4666-9413-e299c6ad0f22-config\") pod \"console-operator-58897d9998-6bqsb\" (UID: \"06f52f14-b54f-4666-9413-e299c6ad0f22\") " pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654641 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-oauth-config\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654656 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654671 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-trusted-ca-bundle\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654687 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e392aad-9ae5-4942-a078-8ef9cbaffb90-serving-cert\") pod \"route-controller-manager-6576b87f9c-6tq2m\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654704 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-audit-policies\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 
19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654719 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654748 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-audit\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654763 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be6f13fb-81da-4176-8540-a3fa61cd7002-audit-dir\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654779 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/77393764-0c2d-4822-8558-71f98dbaef2f-machine-approver-tls\") pod \"machine-approver-56656f9798-gf6g9\" (UID: \"77393764-0c2d-4822-8558-71f98dbaef2f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654801 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c8d22af-7187-427b-8a5c-24e49c3e96cc-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-2vqs7\" (UID: \"5c8d22af-7187-427b-8a5c-24e49c3e96cc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654815 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw5bh\" (UniqueName: \"kubernetes.io/projected/06f52f14-b54f-4666-9413-e299c6ad0f22-kube-api-access-hw5bh\") pod \"console-operator-58897d9998-6bqsb\" (UID: \"06f52f14-b54f-4666-9413-e299c6ad0f22\") " pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654837 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-service-ca\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654850 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-audit-policies\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654863 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-etcd-serving-ca\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654888 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-oauth-serving-cert\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654901 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a88973e4-6669-4d85-89b6-2d287df271ea-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4lrff\" (UID: \"a88973e4-6669-4d85-89b6-2d287df271ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.654916 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06f52f14-b54f-4666-9413-e299c6ad0f22-trusted-ca\") pod \"console-operator-58897d9998-6bqsb\" (UID: \"06f52f14-b54f-4666-9413-e299c6ad0f22\") " pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.655104 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.663493 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.668018 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.682770 4553 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.683072 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.683414 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.683547 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.683768 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.686287 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.686718 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nft69"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.686966 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r22xf"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.687381 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.687655 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.687928 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.697468 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.698327 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.699516 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.701746 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pncrd"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.702626 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pncrd" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.707496 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.711407 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.711724 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.712077 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.712328 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.712444 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.720581 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.722315 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jbl46"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.722978 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbl46" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.723282 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.728941 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.728977 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h6f2z"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.737821 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.738934 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.744406 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.744861 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.745157 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.745554 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.745602 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.745721 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.745774 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.746281 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zvmr2"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.746658 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.746970 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.747080 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9xzz2"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.747180 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.747380 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.747822 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rgm8h"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.747887 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9xzz2" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.750385 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.750755 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.750769 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-j2fv9"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.750823 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.751827 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.753630 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.753663 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6csmn"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.754471 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.754640 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vm8xl"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.755628 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rddkb"] Sep 30 19:34:39 
crc kubenswrapper[4553]: I0930 19:34:39.755738 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.756059 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758374 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5w8z\" (UniqueName: \"kubernetes.io/projected/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-kube-api-access-b5w8z\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758416 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/be6f13fb-81da-4176-8540-a3fa61cd7002-etcd-client\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758448 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87431bdf-f949-4c35-916f-e14903939fe1-metrics-certs\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758475 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-serving-cert\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758499 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758521 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be6f13fb-81da-4176-8540-a3fa61cd7002-serving-cert\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758546 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcm9b\" (UniqueName: \"kubernetes.io/projected/fd13fb01-b6ec-486e-8b39-7440a349ae64-kube-api-access-wcm9b\") pod \"machine-api-operator-5694c8668f-z7r6d\" (UID: \"fd13fb01-b6ec-486e-8b39-7440a349ae64\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758568 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a88973e4-6669-4d85-89b6-2d287df271ea-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4lrff\" (UID: \"a88973e4-6669-4d85-89b6-2d287df271ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758593 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvw2z\" 
(UniqueName: \"kubernetes.io/projected/f65a9561-f1e7-48f5-ab37-4c59699c0b6f-kube-api-access-qvw2z\") pod \"dns-operator-744455d44c-dpgkt\" (UID: \"f65a9561-f1e7-48f5-ab37-4c59699c0b6f\") " pod="openshift-dns-operator/dns-operator-744455d44c-dpgkt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758633 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e392aad-9ae5-4942-a078-8ef9cbaffb90-client-ca\") pod \"route-controller-manager-6576b87f9c-6tq2m\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758656 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl577\" (UniqueName: \"kubernetes.io/projected/76d4f83a-1f82-4374-bc4d-601f752d318d-kube-api-access-xl577\") pod \"downloads-7954f5f757-j2fv9\" (UID: \"76d4f83a-1f82-4374-bc4d-601f752d318d\") " pod="openshift-console/downloads-7954f5f757-j2fv9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758680 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-serving-cert\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758707 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-client-ca\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758729 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77393764-0c2d-4822-8558-71f98dbaef2f-config\") pod \"machine-approver-56656f9798-gf6g9\" (UID: \"77393764-0c2d-4822-8558-71f98dbaef2f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758753 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-serving-cert\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758775 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gbgc\" (UniqueName: \"kubernetes.io/projected/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-kube-api-access-6gbgc\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758798 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758820 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/be6f13fb-81da-4176-8540-a3fa61cd7002-encryption-config\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 
19:34:39.758844 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd13fb01-b6ec-486e-8b39-7440a349ae64-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-z7r6d\" (UID: \"fd13fb01-b6ec-486e-8b39-7440a349ae64\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758867 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c07734d2-f320-4fa6-b259-39862951b066-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xbl8j\" (UID: \"c07734d2-f320-4fa6-b259-39862951b066\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758890 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758913 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8hps\" (UniqueName: \"kubernetes.io/projected/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-kube-api-access-d8hps\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758936 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758960 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7llxb\" (UniqueName: \"kubernetes.io/projected/c07734d2-f320-4fa6-b259-39862951b066-kube-api-access-7llxb\") pod \"openshift-apiserver-operator-796bbdcf4f-xbl8j\" (UID: \"c07734d2-f320-4fa6-b259-39862951b066\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.758984 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f65a9561-f1e7-48f5-ab37-4c59699c0b6f-metrics-tls\") pod \"dns-operator-744455d44c-dpgkt\" (UID: \"f65a9561-f1e7-48f5-ab37-4c59699c0b6f\") " pod="openshift-dns-operator/dns-operator-744455d44c-dpgkt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759005 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/87431bdf-f949-4c35-916f-e14903939fe1-default-certificate\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759028 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/59c0a5ab-f3fc-4842-b79e-8d46d0c479b2-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qg65b\" (UID: \"59c0a5ab-f3fc-4842-b79e-8d46d0c479b2\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759076 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e392aad-9ae5-4942-a078-8ef9cbaffb90-config\") pod \"route-controller-manager-6576b87f9c-6tq2m\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759100 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e38ed88a-7879-4394-b102-aa5ad331aa5e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7gg7n\" (UID: \"e38ed88a-7879-4394-b102-aa5ad331aa5e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759125 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1525a24-eab3-489e-b3c2-2ab74a2c5a60-trusted-ca\") pod \"ingress-operator-5b745b69d9-sxdfp\" (UID: \"d1525a24-eab3-489e-b3c2-2ab74a2c5a60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759152 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a88973e4-6669-4d85-89b6-2d287df271ea-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4lrff\" (UID: \"a88973e4-6669-4d85-89b6-2d287df271ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759174 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgk6h\" 
(UniqueName: \"kubernetes.io/projected/a88973e4-6669-4d85-89b6-2d287df271ea-kube-api-access-bgk6h\") pod \"cluster-image-registry-operator-dc59b4c8b-4lrff\" (UID: \"a88973e4-6669-4d85-89b6-2d287df271ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759196 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759219 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/323f9188-3789-4e7c-b4d2-17f051188a15-serving-cert\") pod \"openshift-config-operator-7777fb866f-rddkb\" (UID: \"323f9188-3789-4e7c-b4d2-17f051188a15\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759241 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759263 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-encryption-config\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759283 4553 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96bf498c-034c-431c-ae07-4099724a48a7-serving-cert\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759302 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-etcd-client\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759325 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-audit-dir\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759348 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759374 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-config\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc 
kubenswrapper[4553]: I0930 19:34:39.759396 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-config\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759420 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsvhl\" (UniqueName: \"kubernetes.io/projected/77393764-0c2d-4822-8558-71f98dbaef2f-kube-api-access-nsvhl\") pod \"machine-approver-56656f9798-gf6g9\" (UID: \"77393764-0c2d-4822-8558-71f98dbaef2f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759444 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-config\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759470 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csg8v\" (UniqueName: \"kubernetes.io/projected/be6f13fb-81da-4176-8540-a3fa61cd7002-kube-api-access-csg8v\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759492 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8d22af-7187-427b-8a5c-24e49c3e96cc-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqs7\" (UID: \"5c8d22af-7187-427b-8a5c-24e49c3e96cc\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759514 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06f52f14-b54f-4666-9413-e299c6ad0f22-serving-cert\") pod \"console-operator-58897d9998-6bqsb\" (UID: \"06f52f14-b54f-4666-9413-e299c6ad0f22\") " pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759537 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1525a24-eab3-489e-b3c2-2ab74a2c5a60-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sxdfp\" (UID: \"d1525a24-eab3-489e-b3c2-2ab74a2c5a60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759559 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqw8r\" (UniqueName: \"kubernetes.io/projected/96bf498c-034c-431c-ae07-4099724a48a7-kube-api-access-vqw8r\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759578 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fd13fb01-b6ec-486e-8b39-7440a349ae64-images\") pod \"machine-api-operator-5694c8668f-z7r6d\" (UID: \"fd13fb01-b6ec-486e-8b39-7440a349ae64\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759609 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4c5k\" (UniqueName: 
\"kubernetes.io/projected/323f9188-3789-4e7c-b4d2-17f051188a15-kube-api-access-z4c5k\") pod \"openshift-config-operator-7777fb866f-rddkb\" (UID: \"323f9188-3789-4e7c-b4d2-17f051188a15\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759625 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77393764-0c2d-4822-8558-71f98dbaef2f-auth-proxy-config\") pod \"machine-approver-56656f9798-gf6g9\" (UID: \"77393764-0c2d-4822-8558-71f98dbaef2f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759641 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xclcn\" (UniqueName: \"kubernetes.io/projected/d1525a24-eab3-489e-b3c2-2ab74a2c5a60-kube-api-access-xclcn\") pod \"ingress-operator-5b745b69d9-sxdfp\" (UID: \"d1525a24-eab3-489e-b3c2-2ab74a2c5a60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759657 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rp8d\" (UniqueName: \"kubernetes.io/projected/87431bdf-f949-4c35-916f-e14903939fe1-kube-api-access-5rp8d\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759676 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbw75\" (UniqueName: \"kubernetes.io/projected/0e392aad-9ae5-4942-a078-8ef9cbaffb90-kube-api-access-rbw75\") pod \"route-controller-manager-6576b87f9c-6tq2m\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759692 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-service-ca-bundle\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759708 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06f52f14-b54f-4666-9413-e299c6ad0f22-config\") pod \"console-operator-58897d9998-6bqsb\" (UID: \"06f52f14-b54f-4666-9413-e299c6ad0f22\") " pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759726 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-oauth-config\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759741 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759757 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/87431bdf-f949-4c35-916f-e14903939fe1-stats-auth\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759777 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-trusted-ca-bundle\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759798 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e392aad-9ae5-4942-a078-8ef9cbaffb90-serving-cert\") pod \"route-controller-manager-6576b87f9c-6tq2m\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759813 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-audit-policies\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759828 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759852 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-audit\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759869 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be6f13fb-81da-4176-8540-a3fa61cd7002-audit-dir\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759891 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c8d22af-7187-427b-8a5c-24e49c3e96cc-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqs7\" (UID: \"5c8d22af-7187-427b-8a5c-24e49c3e96cc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759906 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw5bh\" (UniqueName: \"kubernetes.io/projected/06f52f14-b54f-4666-9413-e299c6ad0f22-kube-api-access-hw5bh\") pod \"console-operator-58897d9998-6bqsb\" (UID: \"06f52f14-b54f-4666-9413-e299c6ad0f22\") " pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759921 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/77393764-0c2d-4822-8558-71f98dbaef2f-machine-approver-tls\") pod \"machine-approver-56656f9798-gf6g9\" (UID: \"77393764-0c2d-4822-8558-71f98dbaef2f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" Sep 30 
19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759936 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-service-ca\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759951 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-audit-policies\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759965 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-etcd-serving-ca\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759981 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-oauth-serving-cert\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.759996 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a88973e4-6669-4d85-89b6-2d287df271ea-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4lrff\" (UID: \"a88973e4-6669-4d85-89b6-2d287df271ea\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760012 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06f52f14-b54f-4666-9413-e299c6ad0f22-trusted-ca\") pod \"console-operator-58897d9998-6bqsb\" (UID: \"06f52f14-b54f-4666-9413-e299c6ad0f22\") " pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760028 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1525a24-eab3-489e-b3c2-2ab74a2c5a60-metrics-tls\") pod \"ingress-operator-5b745b69d9-sxdfp\" (UID: \"d1525a24-eab3-489e-b3c2-2ab74a2c5a60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760029 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760089 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760108 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zh2x\" (UniqueName: 
\"kubernetes.io/projected/59c0a5ab-f3fc-4842-b79e-8d46d0c479b2-kube-api-access-9zh2x\") pod \"cluster-samples-operator-665b6dd947-qg65b\" (UID: \"59c0a5ab-f3fc-4842-b79e-8d46d0c479b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760127 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760144 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/be6f13fb-81da-4176-8540-a3fa61cd7002-node-pullsecrets\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760160 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd13fb01-b6ec-486e-8b39-7440a349ae64-config\") pod \"machine-api-operator-5694c8668f-z7r6d\" (UID: \"fd13fb01-b6ec-486e-8b39-7440a349ae64\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760178 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/323f9188-3789-4e7c-b4d2-17f051188a15-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rddkb\" (UID: \"323f9188-3789-4e7c-b4d2-17f051188a15\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" Sep 30 19:34:39 crc 
kubenswrapper[4553]: I0930 19:34:39.760194 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-audit-dir\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760210 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crc7n\" (UniqueName: \"kubernetes.io/projected/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-kube-api-access-crc7n\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760227 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-image-import-ca\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760242 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-config\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760258 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760298 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760315 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87431bdf-f949-4c35-916f-e14903939fe1-service-ca-bundle\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760331 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-trusted-ca-bundle\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760347 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760363 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760380 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbvmf\" (UniqueName: \"kubernetes.io/projected/5c8d22af-7187-427b-8a5c-24e49c3e96cc-kube-api-access-vbvmf\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqs7\" (UID: \"5c8d22af-7187-427b-8a5c-24e49c3e96cc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760398 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c07734d2-f320-4fa6-b259-39862951b066-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xbl8j\" (UID: \"c07734d2-f320-4fa6-b259-39862951b066\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760418 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e38ed88a-7879-4394-b102-aa5ad331aa5e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7gg7n\" (UID: \"e38ed88a-7879-4394-b102-aa5ad331aa5e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760434 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e38ed88a-7879-4394-b102-aa5ad331aa5e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7gg7n\" (UID: \"e38ed88a-7879-4394-b102-aa5ad331aa5e\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.760465 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77393764-0c2d-4822-8558-71f98dbaef2f-config\") pod \"machine-approver-56656f9798-gf6g9\" (UID: \"77393764-0c2d-4822-8558-71f98dbaef2f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.762641 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e392aad-9ae5-4942-a078-8ef9cbaffb90-config\") pod \"route-controller-manager-6576b87f9c-6tq2m\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.767544 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-audit-policies\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.767953 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-audit\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.768081 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/be6f13fb-81da-4176-8540-a3fa61cd7002-etcd-client\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " 
pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.768147 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6bqsb"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.768574 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/be6f13fb-81da-4176-8540-a3fa61cd7002-audit-dir\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.769836 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-client-ca\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.770615 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.770818 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.772840 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-etcd-serving-ca\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " 
pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.773295 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-service-ca\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.773459 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-oauth-serving-cert\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.773846 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-audit-policies\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.774692 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a88973e4-6669-4d85-89b6-2d287df271ea-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4lrff\" (UID: \"a88973e4-6669-4d85-89b6-2d287df271ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.774863 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: 
\"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.774792 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.775630 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/77393764-0c2d-4822-8558-71f98dbaef2f-machine-approver-tls\") pod \"machine-approver-56656f9798-gf6g9\" (UID: \"77393764-0c2d-4822-8558-71f98dbaef2f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.776061 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c8d22af-7187-427b-8a5c-24e49c3e96cc-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqs7\" (UID: \"5c8d22af-7187-427b-8a5c-24e49c3e96cc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.776897 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8d22af-7187-427b-8a5c-24e49c3e96cc-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqs7\" (UID: \"5c8d22af-7187-427b-8a5c-24e49c3e96cc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.777160 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.778898 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fd13fb01-b6ec-486e-8b39-7440a349ae64-images\") pod \"machine-api-operator-5694c8668f-z7r6d\" (UID: \"fd13fb01-b6ec-486e-8b39-7440a349ae64\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.779440 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77393764-0c2d-4822-8558-71f98dbaef2f-auth-proxy-config\") pod \"machine-approver-56656f9798-gf6g9\" (UID: \"77393764-0c2d-4822-8558-71f98dbaef2f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.781638 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a88973e4-6669-4d85-89b6-2d287df271ea-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4lrff\" (UID: \"a88973e4-6669-4d85-89b6-2d287df271ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.782300 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-serving-cert\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.782615 4553 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06f52f14-b54f-4666-9413-e299c6ad0f22-trusted-ca\") pod \"console-operator-58897d9998-6bqsb\" (UID: \"06f52f14-b54f-4666-9413-e299c6ad0f22\") " pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.783152 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-audit-dir\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.785881 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/be6f13fb-81da-4176-8540-a3fa61cd7002-node-pullsecrets\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.785979 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e392aad-9ae5-4942-a078-8ef9cbaffb90-client-ca\") pod \"route-controller-manager-6576b87f9c-6tq2m\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.786458 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-service-ca-bundle\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 
19:34:39.786536 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd13fb01-b6ec-486e-8b39-7440a349ae64-config\") pod \"machine-api-operator-5694c8668f-z7r6d\" (UID: \"fd13fb01-b6ec-486e-8b39-7440a349ae64\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.786866 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/323f9188-3789-4e7c-b4d2-17f051188a15-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rddkb\" (UID: \"323f9188-3789-4e7c-b4d2-17f051188a15\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.786924 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-audit-dir\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.787026 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06f52f14-b54f-4666-9413-e299c6ad0f22-config\") pod \"console-operator-58897d9998-6bqsb\" (UID: \"06f52f14-b54f-4666-9413-e299c6ad0f22\") " pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.787543 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.787585 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dpgkt"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 
19:34:39.787616 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h6f2z"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.787740 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-image-import-ca\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.788249 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-config\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.788894 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-serving-cert\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.789023 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.789753 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.791054 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd13fb01-b6ec-486e-8b39-7440a349ae64-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-z7r6d\" (UID: \"fd13fb01-b6ec-486e-8b39-7440a349ae64\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.792567 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.792567 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-trusted-ca-bundle\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.792733 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e38ed88a-7879-4394-b102-aa5ad331aa5e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7gg7n\" (UID: \"e38ed88a-7879-4394-b102-aa5ad331aa5e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n" Sep 30 19:34:39 crc 
kubenswrapper[4553]: I0930 19:34:39.792809 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.792909 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-encryption-config\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.793631 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.794699 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.797076 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06f52f14-b54f-4666-9413-e299c6ad0f22-serving-cert\") pod \"console-operator-58897d9998-6bqsb\" (UID: \"06f52f14-b54f-4666-9413-e299c6ad0f22\") " pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.797103 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-oauth-config\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.797172 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-trusted-ca-bundle\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.797408 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.797482 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/323f9188-3789-4e7c-b4d2-17f051188a15-serving-cert\") pod \"openshift-config-operator-7777fb866f-rddkb\" (UID: \"323f9188-3789-4e7c-b4d2-17f051188a15\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.798206 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-config\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.799677 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.800130 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be6f13fb-81da-4176-8540-a3fa61cd7002-config\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.800653 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e38ed88a-7879-4394-b102-aa5ad331aa5e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7gg7n\" (UID: \"e38ed88a-7879-4394-b102-aa5ad331aa5e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.802239 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-config\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.803252 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.805157 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 
19:34:39.806306 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/be6f13fb-81da-4176-8540-a3fa61cd7002-encryption-config\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.809325 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.809716 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96bf498c-034c-431c-ae07-4099724a48a7-serving-cert\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.810328 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be6f13fb-81da-4176-8540-a3fa61cd7002-serving-cert\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.811899 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e392aad-9ae5-4942-a078-8ef9cbaffb90-serving-cert\") pod \"route-controller-manager-6576b87f9c-6tq2m\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 
19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.812090 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-etcd-client\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.812489 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-serving-cert\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.813167 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.813479 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.816283 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.816767 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.826898 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.830984 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.834115 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jbl46"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.836111 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2chmh"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.838198 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6bpj6"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.839238 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nft69"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.840998 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.841890 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.844543 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.848025 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zvmr2"] Sep 30 19:34:39 crc 
kubenswrapper[4553]: I0930 19:34:39.849228 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.852538 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.852871 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.854287 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.855021 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pncrd"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.857158 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.860177 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861126 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87431bdf-f949-4c35-916f-e14903939fe1-service-ca-bundle\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861175 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c07734d2-f320-4fa6-b259-39862951b066-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-xbl8j\" (UID: \"c07734d2-f320-4fa6-b259-39862951b066\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861216 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87431bdf-f949-4c35-916f-e14903939fe1-metrics-certs\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861239 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvw2z\" (UniqueName: \"kubernetes.io/projected/f65a9561-f1e7-48f5-ab37-4c59699c0b6f-kube-api-access-qvw2z\") pod \"dns-operator-744455d44c-dpgkt\" (UID: \"f65a9561-f1e7-48f5-ab37-4c59699c0b6f\") " pod="openshift-dns-operator/dns-operator-744455d44c-dpgkt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861298 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c07734d2-f320-4fa6-b259-39862951b066-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xbl8j\" (UID: \"c07734d2-f320-4fa6-b259-39862951b066\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861329 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7llxb\" (UniqueName: \"kubernetes.io/projected/c07734d2-f320-4fa6-b259-39862951b066-kube-api-access-7llxb\") pod \"openshift-apiserver-operator-796bbdcf4f-xbl8j\" (UID: \"c07734d2-f320-4fa6-b259-39862951b066\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861358 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f65a9561-f1e7-48f5-ab37-4c59699c0b6f-metrics-tls\") pod \"dns-operator-744455d44c-dpgkt\" (UID: \"f65a9561-f1e7-48f5-ab37-4c59699c0b6f\") " pod="openshift-dns-operator/dns-operator-744455d44c-dpgkt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861378 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/87431bdf-f949-4c35-916f-e14903939fe1-default-certificate\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861399 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/59c0a5ab-f3fc-4842-b79e-8d46d0c479b2-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qg65b\" (UID: \"59c0a5ab-f3fc-4842-b79e-8d46d0c479b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861421 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1525a24-eab3-489e-b3c2-2ab74a2c5a60-trusted-ca\") pod \"ingress-operator-5b745b69d9-sxdfp\" (UID: \"d1525a24-eab3-489e-b3c2-2ab74a2c5a60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861454 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1525a24-eab3-489e-b3c2-2ab74a2c5a60-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sxdfp\" (UID: \"d1525a24-eab3-489e-b3c2-2ab74a2c5a60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" Sep 30 19:34:39 
crc kubenswrapper[4553]: I0930 19:34:39.861502 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rp8d\" (UniqueName: \"kubernetes.io/projected/87431bdf-f949-4c35-916f-e14903939fe1-kube-api-access-5rp8d\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861588 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xclcn\" (UniqueName: \"kubernetes.io/projected/d1525a24-eab3-489e-b3c2-2ab74a2c5a60-kube-api-access-xclcn\") pod \"ingress-operator-5b745b69d9-sxdfp\" (UID: \"d1525a24-eab3-489e-b3c2-2ab74a2c5a60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861622 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/87431bdf-f949-4c35-916f-e14903939fe1-stats-auth\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861706 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1525a24-eab3-489e-b3c2-2ab74a2c5a60-metrics-tls\") pod \"ingress-operator-5b745b69d9-sxdfp\" (UID: \"d1525a24-eab3-489e-b3c2-2ab74a2c5a60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.861734 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zh2x\" (UniqueName: \"kubernetes.io/projected/59c0a5ab-f3fc-4842-b79e-8d46d0c479b2-kube-api-access-9zh2x\") pod \"cluster-samples-operator-665b6dd947-qg65b\" (UID: \"59c0a5ab-f3fc-4842-b79e-8d46d0c479b2\") 
" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.862760 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c07734d2-f320-4fa6-b259-39862951b066-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xbl8j\" (UID: \"c07734d2-f320-4fa6-b259-39862951b066\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.863443 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.863474 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.866362 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vm8xl"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.868638 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f65a9561-f1e7-48f5-ab37-4c59699c0b6f-metrics-tls\") pod \"dns-operator-744455d44c-dpgkt\" (UID: \"f65a9561-f1e7-48f5-ab37-4c59699c0b6f\") " pod="openshift-dns-operator/dns-operator-744455d44c-dpgkt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.869827 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.869995 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c07734d2-f320-4fa6-b259-39862951b066-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xbl8j\" (UID: 
\"c07734d2-f320-4fa6-b259-39862951b066\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.871638 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-656jw"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.872597 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-656jw" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.873216 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-656jw"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.874459 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lsmgs"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.875162 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lsmgs" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.876543 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lsmgs"] Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.889884 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.909748 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.929585 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.938614 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1525a24-eab3-489e-b3c2-2ab74a2c5a60-metrics-tls\") pod \"ingress-operator-5b745b69d9-sxdfp\" (UID: 
\"d1525a24-eab3-489e-b3c2-2ab74a2c5a60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.959813 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.966392 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1525a24-eab3-489e-b3c2-2ab74a2c5a60-trusted-ca\") pod \"ingress-operator-5b745b69d9-sxdfp\" (UID: \"d1525a24-eab3-489e-b3c2-2ab74a2c5a60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.970303 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 30 19:34:39 crc kubenswrapper[4553]: I0930 19:34:39.989526 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.010353 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.030191 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.039385 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/59c0a5ab-f3fc-4842-b79e-8d46d0c479b2-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qg65b\" (UID: \"59c0a5ab-f3fc-4842-b79e-8d46d0c479b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.050155 4553 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.070212 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.090127 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.110260 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.130761 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.137307 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/87431bdf-f949-4c35-916f-e14903939fe1-stats-auth\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.149742 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.155862 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87431bdf-f949-4c35-916f-e14903939fe1-metrics-certs\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.170350 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 30 19:34:40 crc 
kubenswrapper[4553]: I0930 19:34:40.190430 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.199401 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/87431bdf-f949-4c35-916f-e14903939fe1-default-certificate\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.209623 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.214252 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87431bdf-f949-4c35-916f-e14903939fe1-service-ca-bundle\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.229473 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.250656 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.269876 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.290591 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.310828 4553 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.330148 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.349127 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.371645 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.392659 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.410650 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.450467 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.471644 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.491117 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.510686 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.530954 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.550516 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.570821 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.591573 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.610821 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.632272 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.651459 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.670902 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.691111 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.711006 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.728421 4553 request.go:700] Waited for 1.015733827s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-controller-dockercfg-c2lfx&limit=500&resourceVersion=0
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.730442 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.750383 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.770335 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.791148 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.812021 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.830996 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.850312 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.871861 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.889812 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.911100 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.930320 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.950112 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.970856 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Sep 30 19:34:40 crc kubenswrapper[4553]: I0930 19:34:40.990830 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.011120 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.030934 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.050550 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.070356 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.090955 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.110684 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.131856 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.150397 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.171274 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.191914 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.211330 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.245227 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.251172 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.271330 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.290904 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.310009 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.330809 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.349876 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.370991 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.390519 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.410605 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.431087 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.452133 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.469899 4553 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.491212 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.544074 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5w8z\" (UniqueName: \"kubernetes.io/projected/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-kube-api-access-b5w8z\") pod \"oauth-openshift-558db77b4-2chmh\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") " pod="openshift-authentication/oauth-openshift-558db77b4-2chmh"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.567789 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e38ed88a-7879-4394-b102-aa5ad331aa5e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7gg7n\" (UID: \"e38ed88a-7879-4394-b102-aa5ad331aa5e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.589901 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gbgc\" (UniqueName: \"kubernetes.io/projected/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-kube-api-access-6gbgc\") pod \"console-f9d7485db-6csmn\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " pod="openshift-console/console-f9d7485db-6csmn"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.602520 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw5bh\" (UniqueName: \"kubernetes.io/projected/06f52f14-b54f-4666-9413-e299c6ad0f22-kube-api-access-hw5bh\") pod \"console-operator-58897d9998-6bqsb\" (UID: \"06f52f14-b54f-4666-9413-e299c6ad0f22\") " pod="openshift-console-operator/console-operator-58897d9998-6bqsb"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.617159 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csg8v\" (UniqueName: \"kubernetes.io/projected/be6f13fb-81da-4176-8540-a3fa61cd7002-kube-api-access-csg8v\") pod \"apiserver-76f77b778f-djhpv\" (UID: \"be6f13fb-81da-4176-8540-a3fa61cd7002\") " pod="openshift-apiserver/apiserver-76f77b778f-djhpv"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.630673 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a88973e4-6669-4d85-89b6-2d287df271ea-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4lrff\" (UID: \"a88973e4-6669-4d85-89b6-2d287df271ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.647674 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6csmn"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.660412 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6bqsb"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.669567 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgk6h\" (UniqueName: \"kubernetes.io/projected/a88973e4-6669-4d85-89b6-2d287df271ea-kube-api-access-bgk6h\") pod \"cluster-image-registry-operator-dc59b4c8b-4lrff\" (UID: \"a88973e4-6669-4d85-89b6-2d287df271ea\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.680810 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8hps\" (UniqueName: \"kubernetes.io/projected/b23246e9-901f-436d-b8c4-d9ffc47dc3a7-kube-api-access-d8hps\") pod \"authentication-operator-69f744f599-rgm8h\" (UID: \"b23246e9-901f-436d-b8c4-d9ffc47dc3a7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.686566 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.686923 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqw8r\" (UniqueName: \"kubernetes.io/projected/96bf498c-034c-431c-ae07-4099724a48a7-kube-api-access-vqw8r\") pod \"controller-manager-879f6c89f-5dq4n\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.693364 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.710974 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4c5k\" (UniqueName: \"kubernetes.io/projected/323f9188-3789-4e7c-b4d2-17f051188a15-kube-api-access-z4c5k\") pod \"openshift-config-operator-7777fb866f-rddkb\" (UID: \"323f9188-3789-4e7c-b4d2-17f051188a15\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.711471 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-djhpv"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.738792 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.745555 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbw75\" (UniqueName: \"kubernetes.io/projected/0e392aad-9ae5-4942-a078-8ef9cbaffb90-kube-api-access-rbw75\") pod \"route-controller-manager-6576b87f9c-6tq2m\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.748742 4553 request.go:700] Waited for 1.961719446s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-oauth-apiserver/serviceaccounts/oauth-apiserver-sa/token
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.750521 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl577\" (UniqueName: \"kubernetes.io/projected/76d4f83a-1f82-4374-bc4d-601f752d318d-kube-api-access-xl577\") pod \"downloads-7954f5f757-j2fv9\" (UID: \"76d4f83a-1f82-4374-bc4d-601f752d318d\") " pod="openshift-console/downloads-7954f5f757-j2fv9"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.773907 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crc7n\" (UniqueName: \"kubernetes.io/projected/19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc-kube-api-access-crc7n\") pod \"apiserver-7bbb656c7d-vlpg8\" (UID: \"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.786658 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.792319 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcm9b\" (UniqueName: \"kubernetes.io/projected/fd13fb01-b6ec-486e-8b39-7440a349ae64-kube-api-access-wcm9b\") pod \"machine-api-operator-5694c8668f-z7r6d\" (UID: \"fd13fb01-b6ec-486e-8b39-7440a349ae64\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.802262 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.823857 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbvmf\" (UniqueName: \"kubernetes.io/projected/5c8d22af-7187-427b-8a5c-24e49c3e96cc-kube-api-access-vbvmf\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqs7\" (UID: \"5c8d22af-7187-427b-8a5c-24e49c3e96cc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.829919 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.840144 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.843026 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsvhl\" (UniqueName: \"kubernetes.io/projected/77393764-0c2d-4822-8558-71f98dbaef2f-kube-api-access-nsvhl\") pod \"machine-approver-56656f9798-gf6g9\" (UID: \"77393764-0c2d-4822-8558-71f98dbaef2f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.848246 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zh2x\" (UniqueName: \"kubernetes.io/projected/59c0a5ab-f3fc-4842-b79e-8d46d0c479b2-kube-api-access-9zh2x\") pod \"cluster-samples-operator-665b6dd947-qg65b\" (UID: \"59c0a5ab-f3fc-4842-b79e-8d46d0c479b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.860492 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.864629 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvw2z\" (UniqueName: \"kubernetes.io/projected/f65a9561-f1e7-48f5-ab37-4c59699c0b6f-kube-api-access-qvw2z\") pod \"dns-operator-744455d44c-dpgkt\" (UID: \"f65a9561-f1e7-48f5-ab37-4c59699c0b6f\") " pod="openshift-dns-operator/dns-operator-744455d44c-dpgkt"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.896518 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7llxb\" (UniqueName: \"kubernetes.io/projected/c07734d2-f320-4fa6-b259-39862951b066-kube-api-access-7llxb\") pod \"openshift-apiserver-operator-796bbdcf4f-xbl8j\" (UID: \"c07734d2-f320-4fa6-b259-39862951b066\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.915778 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xclcn\" (UniqueName: \"kubernetes.io/projected/d1525a24-eab3-489e-b3c2-2ab74a2c5a60-kube-api-access-xclcn\") pod \"ingress-operator-5b745b69d9-sxdfp\" (UID: \"d1525a24-eab3-489e-b3c2-2ab74a2c5a60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.926024 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1525a24-eab3-489e-b3c2-2ab74a2c5a60-bound-sa-token\") pod \"ingress-operator-5b745b69d9-sxdfp\" (UID: \"d1525a24-eab3-489e-b3c2-2ab74a2c5a60\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.949207 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rp8d\" (UniqueName: \"kubernetes.io/projected/87431bdf-f949-4c35-916f-e14903939fe1-kube-api-access-5rp8d\") pod \"router-default-5444994796-r22xf\" (UID: \"87431bdf-f949-4c35-916f-e14903939fe1\") " pod="openshift-ingress/router-default-5444994796-r22xf"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.968333 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-j2fv9"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.969418 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.987535 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d"
Sep 30 19:34:41 crc kubenswrapper[4553]: I0930 19:34:41.990046 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.002300 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.005471 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.010972 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.021371 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-dpgkt"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.024743 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.033538 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.033865 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.050972 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.055821 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.060306 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r22xf"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.071267 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.090889 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202206 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202311 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f298352-909f-4017-ae2a-ac8deda23167-proxy-tls\") pod \"machine-config-operator-74547568cd-qhwcx\" (UID: \"5f298352-909f-4017-ae2a-ac8deda23167\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202332 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-bound-sa-token\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202375 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-registry-tls\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202394 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfks5\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-kube-api-access-jfks5\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202410 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50e7e6b4-78bd-4209-bf3e-7c27662763fd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202424 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29d9e48-584a-4c95-a9b7-1039a800071e-config\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202441 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d29d9e48-584a-4c95-a9b7-1039a800071e-etcd-ca\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202487 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdpsn\" (UniqueName: \"kubernetes.io/projected/d29d9e48-584a-4c95-a9b7-1039a800071e-kube-api-access-sdpsn\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202503 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d29d9e48-584a-4c95-a9b7-1039a800071e-serving-cert\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202538 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5f298352-909f-4017-ae2a-ac8deda23167-images\") pod \"machine-config-operator-74547568cd-qhwcx\" (UID: \"5f298352-909f-4017-ae2a-ac8deda23167\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202577 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d29d9e48-584a-4c95-a9b7-1039a800071e-etcd-client\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202613 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vm6j\" (UniqueName: \"kubernetes.io/projected/5f298352-909f-4017-ae2a-ac8deda23167-kube-api-access-7vm6j\") pod \"machine-config-operator-74547568cd-qhwcx\" (UID: \"5f298352-909f-4017-ae2a-ac8deda23167\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202633 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50e7e6b4-78bd-4209-bf3e-7c27662763fd-trusted-ca\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202649 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50e7e6b4-78bd-4209-bf3e-7c27662763fd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202728 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50e7e6b4-78bd-4209-bf3e-7c27662763fd-registry-certificates\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202756 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5f298352-909f-4017-ae2a-ac8deda23167-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qhwcx\" (UID: \"5f298352-909f-4017-ae2a-ac8deda23167\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.202771 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d29d9e48-584a-4c95-a9b7-1039a800071e-etcd-service-ca\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69"
Sep 30 19:34:42 crc kubenswrapper[4553]: E0930 19:34:42.203371 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:42.703358255 +0000 UTC m=+135.902860385 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.303695 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 19:34:42 crc kubenswrapper[4553]: E0930 19:34:42.303842 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:42.803812835 +0000 UTC m=+136.003314965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.303907 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0786ae0-90bb-446a-96ae-5b522f776e0b-apiservice-cert\") pod \"packageserver-d55dfcdfc-8f88f\" (UID: \"e0786ae0-90bb-446a-96ae-5b522f776e0b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.303971 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-88nf5\" (UID: \"ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304111 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8efd871e-42ac-408d-9a1f-3635cb099a4b-config-volume\") pod \"dns-default-656jw\" (UID: \"8efd871e-42ac-408d-9a1f-3635cb099a4b\") " pod="openshift-dns/dns-default-656jw"
Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304185 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50e7e6b4-78bd-4209-bf3e-7c27662763fd-trusted-ca\") pod
\"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304213 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94c26543-8110-4758-8860-54fe4f1349aa-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r49lm\" (UID: \"94c26543-8110-4758-8860-54fe4f1349aa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304241 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbdxj\" (UniqueName: \"kubernetes.io/projected/5a5f65f6-0be4-40dd-be78-16de7ada0614-kube-api-access-lbdxj\") pod \"kube-storage-version-migrator-operator-b67b599dd-254cf\" (UID: \"5a5f65f6-0be4-40dd-be78-16de7ada0614\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304362 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0152fc34-26e8-4555-bb44-227eb61394b6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cscrv\" (UID: \"0152fc34-26e8-4555-bb44-227eb61394b6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304585 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50e7e6b4-78bd-4209-bf3e-7c27662763fd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304654 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/58c5f2ca-b4da-444a-8ecc-22c758896df9-signing-key\") pod \"service-ca-9c57cc56f-h6f2z\" (UID: \"58c5f2ca-b4da-444a-8ecc-22c758896df9\") " pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304675 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-plugins-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304700 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9202a914-8e69-4d7e-9337-eff52d5c4bd6-cert\") pod \"ingress-canary-lsmgs\" (UID: \"9202a914-8e69-4d7e-9337-eff52d5c4bd6\") " pod="openshift-ingress-canary/ingress-canary-lsmgs" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304760 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066d34b5-1b98-40a8-98a5-59f7edcc43e4-config\") pod \"service-ca-operator-777779d784-4zxpw\" (UID: \"066d34b5-1b98-40a8-98a5-59f7edcc43e4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304779 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5kxb\" (UniqueName: \"kubernetes.io/projected/8cb17eb3-73f0-4235-9bf8-11723b544e6e-kube-api-access-g5kxb\") pod 
\"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304815 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f416f93e-65d8-49ae-a035-40bf9857525a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bz92f\" (UID: \"f416f93e-65d8-49ae-a035-40bf9857525a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304834 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f416f93e-65d8-49ae-a035-40bf9857525a-config\") pod \"kube-apiserver-operator-766d6c64bb-bz92f\" (UID: \"f416f93e-65d8-49ae-a035-40bf9857525a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304925 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg4ng\" (UniqueName: \"kubernetes.io/projected/ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8-kube-api-access-hg4ng\") pod \"machine-config-controller-84d6567774-88nf5\" (UID: \"ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.304952 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ffff74f-1337-47da-907a-f0e10382509d-config-volume\") pod \"collect-profiles-29321010-wlr9z\" (UID: \"0ffff74f-1337-47da-907a-f0e10382509d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 
19:34:42.304974 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0152fc34-26e8-4555-bb44-227eb61394b6-srv-cert\") pod \"olm-operator-6b444d44fb-cscrv\" (UID: \"0152fc34-26e8-4555-bb44-227eb61394b6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305007 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d29d9e48-584a-4c95-a9b7-1039a800071e-etcd-service-ca\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305031 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nmwg\" (UniqueName: \"kubernetes.io/projected/0152fc34-26e8-4555-bb44-227eb61394b6-kube-api-access-9nmwg\") pod \"olm-operator-6b444d44fb-cscrv\" (UID: \"0152fc34-26e8-4555-bb44-227eb61394b6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305610 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-mountpoint-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305653 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: 
\"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305674 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf81a909-a6ef-46c2-9d8f-ec12c2c748f4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bnsph\" (UID: \"cf81a909-a6ef-46c2-9d8f-ec12c2c748f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305711 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/38703e61-9d3b-4d8d-aae8-9740c0948ceb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zvmr2\" (UID: \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305754 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m4fw\" (UniqueName: \"kubernetes.io/projected/ade6bc35-f568-4e61-b273-dbc590e64141-kube-api-access-8m4fw\") pod \"catalog-operator-68c6474976-4f82f\" (UID: \"ade6bc35-f568-4e61-b273-dbc590e64141\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305794 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwjr8\" (UniqueName: \"kubernetes.io/projected/70ef651c-3015-42e3-9d1f-df3833f6343a-kube-api-access-cwjr8\") pod \"multus-admission-controller-857f4d67dd-pncrd\" (UID: \"70ef651c-3015-42e3-9d1f-df3833f6343a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pncrd" Sep 30 19:34:42 crc 
kubenswrapper[4553]: I0930 19:34:42.305810 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-csi-data-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305840 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/066d34b5-1b98-40a8-98a5-59f7edcc43e4-serving-cert\") pod \"service-ca-operator-777779d784-4zxpw\" (UID: \"066d34b5-1b98-40a8-98a5-59f7edcc43e4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305855 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efd871e-42ac-408d-9a1f-3635cb099a4b-metrics-tls\") pod \"dns-default-656jw\" (UID: \"8efd871e-42ac-408d-9a1f-3635cb099a4b\") " pod="openshift-dns/dns-default-656jw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305874 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-registration-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305891 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/51d0f112-9b15-4602-b8a6-16e79dfeb4cb-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-xz69w\" (UID: \"51d0f112-9b15-4602-b8a6-16e79dfeb4cb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305930 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ffff74f-1337-47da-907a-f0e10382509d-secret-volume\") pod \"collect-profiles-29321010-wlr9z\" (UID: \"0ffff74f-1337-47da-907a-f0e10382509d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305957 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29d9e48-584a-4c95-a9b7-1039a800071e-config\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.305986 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d29d9e48-584a-4c95-a9b7-1039a800071e-etcd-ca\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306006 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/056396e2-e9e7-4bb6-8be4-221c931d490e-certs\") pod \"machine-config-server-9xzz2\" (UID: \"056396e2-e9e7-4bb6-8be4-221c931d490e\") " pod="openshift-machine-config-operator/machine-config-server-9xzz2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306058 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtnnb\" 
(UniqueName: \"kubernetes.io/projected/38703e61-9d3b-4d8d-aae8-9740c0948ceb-kube-api-access-qtnnb\") pod \"marketplace-operator-79b997595-zvmr2\" (UID: \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306074 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0786ae0-90bb-446a-96ae-5b522f776e0b-webhook-cert\") pod \"packageserver-d55dfcdfc-8f88f\" (UID: \"e0786ae0-90bb-446a-96ae-5b522f776e0b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306101 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5f298352-909f-4017-ae2a-ac8deda23167-images\") pod \"machine-config-operator-74547568cd-qhwcx\" (UID: \"5f298352-909f-4017-ae2a-ac8deda23167\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306118 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-socket-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306155 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d29d9e48-584a-4c95-a9b7-1039a800071e-etcd-client\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306369 4553 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vm6j\" (UniqueName: \"kubernetes.io/projected/5f298352-909f-4017-ae2a-ac8deda23167-kube-api-access-7vm6j\") pod \"machine-config-operator-74547568cd-qhwcx\" (UID: \"5f298352-909f-4017-ae2a-ac8deda23167\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306412 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/70ef651c-3015-42e3-9d1f-df3833f6343a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pncrd\" (UID: \"70ef651c-3015-42e3-9d1f-df3833f6343a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pncrd" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306449 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5f65f6-0be4-40dd-be78-16de7ada0614-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-254cf\" (UID: \"5a5f65f6-0be4-40dd-be78-16de7ada0614\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306466 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/58c5f2ca-b4da-444a-8ecc-22c758896df9-signing-cabundle\") pod \"service-ca-9c57cc56f-h6f2z\" (UID: \"58c5f2ca-b4da-444a-8ecc-22c758896df9\") " pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306484 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqcwr\" (UniqueName: 
\"kubernetes.io/projected/9202a914-8e69-4d7e-9337-eff52d5c4bd6-kube-api-access-cqcwr\") pod \"ingress-canary-lsmgs\" (UID: \"9202a914-8e69-4d7e-9337-eff52d5c4bd6\") " pod="openshift-ingress-canary/ingress-canary-lsmgs" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306501 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5srrt\" (UniqueName: \"kubernetes.io/projected/94c26543-8110-4758-8860-54fe4f1349aa-kube-api-access-5srrt\") pod \"package-server-manager-789f6589d5-r49lm\" (UID: \"94c26543-8110-4758-8860-54fe4f1349aa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306515 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8sqv\" (UniqueName: \"kubernetes.io/projected/e0786ae0-90bb-446a-96ae-5b522f776e0b-kube-api-access-k8sqv\") pod \"packageserver-d55dfcdfc-8f88f\" (UID: \"e0786ae0-90bb-446a-96ae-5b522f776e0b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306577 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ade6bc35-f568-4e61-b273-dbc590e64141-profile-collector-cert\") pod \"catalog-operator-68c6474976-4f82f\" (UID: \"ade6bc35-f568-4e61-b273-dbc590e64141\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306611 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf81a909-a6ef-46c2-9d8f-ec12c2c748f4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bnsph\" (UID: \"cf81a909-a6ef-46c2-9d8f-ec12c2c748f4\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306647 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50e7e6b4-78bd-4209-bf3e-7c27662763fd-registry-certificates\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306672 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5f298352-909f-4017-ae2a-ac8deda23167-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qhwcx\" (UID: \"5f298352-909f-4017-ae2a-ac8deda23167\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306700 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvpgj\" (UniqueName: \"kubernetes.io/projected/58c5f2ca-b4da-444a-8ecc-22c758896df9-kube-api-access-lvpgj\") pod \"service-ca-9c57cc56f-h6f2z\" (UID: \"58c5f2ca-b4da-444a-8ecc-22c758896df9\") " pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306716 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f416f93e-65d8-49ae-a035-40bf9857525a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bz92f\" (UID: \"f416f93e-65d8-49ae-a035-40bf9857525a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306738 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-92k7m\" (UniqueName: \"kubernetes.io/projected/066d34b5-1b98-40a8-98a5-59f7edcc43e4-kube-api-access-92k7m\") pod \"service-ca-operator-777779d784-4zxpw\" (UID: \"066d34b5-1b98-40a8-98a5-59f7edcc43e4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306798 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc58j\" (UniqueName: \"kubernetes.io/projected/8efd871e-42ac-408d-9a1f-3635cb099a4b-kube-api-access-nc58j\") pod \"dns-default-656jw\" (UID: \"8efd871e-42ac-408d-9a1f-3635cb099a4b\") " pod="openshift-dns/dns-default-656jw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306814 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7gdg\" (UniqueName: \"kubernetes.io/projected/51d0f112-9b15-4602-b8a6-16e79dfeb4cb-kube-api-access-n7gdg\") pod \"control-plane-machine-set-operator-78cbb6b69f-xz69w\" (UID: \"51d0f112-9b15-4602-b8a6-16e79dfeb4cb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306832 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f298352-909f-4017-ae2a-ac8deda23167-proxy-tls\") pod \"machine-config-operator-74547568cd-qhwcx\" (UID: \"5f298352-909f-4017-ae2a-ac8deda23167\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306857 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-bound-sa-token\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306872 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38703e61-9d3b-4d8d-aae8-9740c0948ceb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zvmr2\" (UID: \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306888 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ade6bc35-f568-4e61-b273-dbc590e64141-srv-cert\") pod \"catalog-operator-68c6474976-4f82f\" (UID: \"ade6bc35-f568-4e61-b273-dbc590e64141\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306904 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8-proxy-tls\") pod \"machine-config-controller-84d6567774-88nf5\" (UID: \"ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306940 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/056396e2-e9e7-4bb6-8be4-221c931d490e-node-bootstrap-token\") pod \"machine-config-server-9xzz2\" (UID: \"056396e2-e9e7-4bb6-8be4-221c931d490e\") " pod="openshift-machine-config-operator/machine-config-server-9xzz2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306957 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5x8hm\" (UniqueName: \"kubernetes.io/projected/bf4bad02-1539-46e4-a436-5c4f2c94fda1-kube-api-access-5x8hm\") pod \"migrator-59844c95c7-jbl46\" (UID: \"bf4bad02-1539-46e4-a436-5c4f2c94fda1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbl46" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306975 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfks5\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-kube-api-access-jfks5\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.306993 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a5f65f6-0be4-40dd-be78-16de7ada0614-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-254cf\" (UID: \"5a5f65f6-0be4-40dd-be78-16de7ada0614\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.307022 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-registry-tls\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.307057 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2rkm\" (UniqueName: \"kubernetes.io/projected/0ffff74f-1337-47da-907a-f0e10382509d-kube-api-access-h2rkm\") pod \"collect-profiles-29321010-wlr9z\" (UID: \"0ffff74f-1337-47da-907a-f0e10382509d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.307085 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50e7e6b4-78bd-4209-bf3e-7c27662763fd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.307102 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf81a909-a6ef-46c2-9d8f-ec12c2c748f4-config\") pod \"kube-controller-manager-operator-78b949d7b-bnsph\" (UID: \"cf81a909-a6ef-46c2-9d8f-ec12c2c748f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.307211 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50e7e6b4-78bd-4209-bf3e-7c27662763fd-trusted-ca\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.307925 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50e7e6b4-78bd-4209-bf3e-7c27662763fd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.308515 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d29d9e48-584a-4c95-a9b7-1039a800071e-etcd-service-ca\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.309348 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e0786ae0-90bb-446a-96ae-5b522f776e0b-tmpfs\") pod \"packageserver-d55dfcdfc-8f88f\" (UID: \"e0786ae0-90bb-446a-96ae-5b522f776e0b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.309385 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d29d9e48-584a-4c95-a9b7-1039a800071e-serving-cert\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.309404 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdpsn\" (UniqueName: \"kubernetes.io/projected/d29d9e48-584a-4c95-a9b7-1039a800071e-kube-api-access-sdpsn\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.309479 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc6cd\" (UniqueName: \"kubernetes.io/projected/056396e2-e9e7-4bb6-8be4-221c931d490e-kube-api-access-cc6cd\") pod \"machine-config-server-9xzz2\" (UID: \"056396e2-e9e7-4bb6-8be4-221c931d490e\") " pod="openshift-machine-config-operator/machine-config-server-9xzz2" Sep 30 19:34:42 crc kubenswrapper[4553]: E0930 19:34:42.309546 4553 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:42.809534729 +0000 UTC m=+136.009036859 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.310757 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50e7e6b4-78bd-4209-bf3e-7c27662763fd-registry-certificates\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.311396 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5f298352-909f-4017-ae2a-ac8deda23167-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qhwcx\" (UID: \"5f298352-909f-4017-ae2a-ac8deda23167\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.313355 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d29d9e48-584a-4c95-a9b7-1039a800071e-config\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:42 
crc kubenswrapper[4553]: I0930 19:34:42.313836 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d29d9e48-584a-4c95-a9b7-1039a800071e-etcd-ca\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.319188 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5f298352-909f-4017-ae2a-ac8deda23167-images\") pod \"machine-config-operator-74547568cd-qhwcx\" (UID: \"5f298352-909f-4017-ae2a-ac8deda23167\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.327674 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f298352-909f-4017-ae2a-ac8deda23167-proxy-tls\") pod \"machine-config-operator-74547568cd-qhwcx\" (UID: \"5f298352-909f-4017-ae2a-ac8deda23167\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.334677 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50e7e6b4-78bd-4209-bf3e-7c27662763fd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.338641 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d29d9e48-584a-4c95-a9b7-1039a800071e-serving-cert\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.359269 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.359763 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d29d9e48-584a-4c95-a9b7-1039a800071e-etcd-client\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.359881 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-registry-tls\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.371940 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.373808 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2chmh"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.386563 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdpsn\" (UniqueName: \"kubernetes.io/projected/d29d9e48-584a-4c95-a9b7-1039a800071e-kube-api-access-sdpsn\") pod \"etcd-operator-b45778765-nft69\" (UID: \"d29d9e48-584a-4c95-a9b7-1039a800071e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.394277 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" event={"ID":"77393764-0c2d-4822-8558-71f98dbaef2f","Type":"ContainerStarted","Data":"7c5870f5fc9f19bdb0f1945e59e34eab1d7ba4782d75e54d602ebe874137986f"} Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.397597 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r22xf" event={"ID":"87431bdf-f949-4c35-916f-e14903939fe1","Type":"ContainerStarted","Data":"a4471d211fb37816a4661798e1769a2216b94add01da108f3897cb21b30f11d1"} Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.401799 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6bqsb"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.402540 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5dq4n"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.409977 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.410816 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5f65f6-0be4-40dd-be78-16de7ada0614-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-254cf\" (UID: \"5a5f65f6-0be4-40dd-be78-16de7ada0614\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.410906 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/58c5f2ca-b4da-444a-8ecc-22c758896df9-signing-cabundle\") pod \"service-ca-9c57cc56f-h6f2z\" (UID: \"58c5f2ca-b4da-444a-8ecc-22c758896df9\") " pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.410978 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqcwr\" (UniqueName: \"kubernetes.io/projected/9202a914-8e69-4d7e-9337-eff52d5c4bd6-kube-api-access-cqcwr\") pod \"ingress-canary-lsmgs\" (UID: \"9202a914-8e69-4d7e-9337-eff52d5c4bd6\") " pod="openshift-ingress-canary/ingress-canary-lsmgs" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.411097 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5srrt\" (UniqueName: \"kubernetes.io/projected/94c26543-8110-4758-8860-54fe4f1349aa-kube-api-access-5srrt\") pod \"package-server-manager-789f6589d5-r49lm\" (UID: \"94c26543-8110-4758-8860-54fe4f1349aa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.411183 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8sqv\" (UniqueName: \"kubernetes.io/projected/e0786ae0-90bb-446a-96ae-5b522f776e0b-kube-api-access-k8sqv\") pod \"packageserver-d55dfcdfc-8f88f\" (UID: \"e0786ae0-90bb-446a-96ae-5b522f776e0b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.411258 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ade6bc35-f568-4e61-b273-dbc590e64141-profile-collector-cert\") pod \"catalog-operator-68c6474976-4f82f\" (UID: \"ade6bc35-f568-4e61-b273-dbc590e64141\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 
19:34:42.411346 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf81a909-a6ef-46c2-9d8f-ec12c2c748f4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bnsph\" (UID: \"cf81a909-a6ef-46c2-9d8f-ec12c2c748f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.411454 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f416f93e-65d8-49ae-a035-40bf9857525a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bz92f\" (UID: \"f416f93e-65d8-49ae-a035-40bf9857525a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.411956 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvpgj\" (UniqueName: \"kubernetes.io/projected/58c5f2ca-b4da-444a-8ecc-22c758896df9-kube-api-access-lvpgj\") pod \"service-ca-9c57cc56f-h6f2z\" (UID: \"58c5f2ca-b4da-444a-8ecc-22c758896df9\") " pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.412157 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92k7m\" (UniqueName: \"kubernetes.io/projected/066d34b5-1b98-40a8-98a5-59f7edcc43e4-kube-api-access-92k7m\") pod \"service-ca-operator-777779d784-4zxpw\" (UID: \"066d34b5-1b98-40a8-98a5-59f7edcc43e4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.412249 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc58j\" (UniqueName: \"kubernetes.io/projected/8efd871e-42ac-408d-9a1f-3635cb099a4b-kube-api-access-nc58j\") pod \"dns-default-656jw\" (UID: 
\"8efd871e-42ac-408d-9a1f-3635cb099a4b\") " pod="openshift-dns/dns-default-656jw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.412327 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7gdg\" (UniqueName: \"kubernetes.io/projected/51d0f112-9b15-4602-b8a6-16e79dfeb4cb-kube-api-access-n7gdg\") pod \"control-plane-machine-set-operator-78cbb6b69f-xz69w\" (UID: \"51d0f112-9b15-4602-b8a6-16e79dfeb4cb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.412402 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ade6bc35-f568-4e61-b273-dbc590e64141-srv-cert\") pod \"catalog-operator-68c6474976-4f82f\" (UID: \"ade6bc35-f568-4e61-b273-dbc590e64141\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.412489 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38703e61-9d3b-4d8d-aae8-9740c0948ceb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zvmr2\" (UID: \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.412560 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x8hm\" (UniqueName: \"kubernetes.io/projected/bf4bad02-1539-46e4-a436-5c4f2c94fda1-kube-api-access-5x8hm\") pod \"migrator-59844c95c7-jbl46\" (UID: \"bf4bad02-1539-46e4-a436-5c4f2c94fda1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbl46" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.412636 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8-proxy-tls\") pod \"machine-config-controller-84d6567774-88nf5\" (UID: \"ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.412710 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/056396e2-e9e7-4bb6-8be4-221c931d490e-node-bootstrap-token\") pod \"machine-config-server-9xzz2\" (UID: \"056396e2-e9e7-4bb6-8be4-221c931d490e\") " pod="openshift-machine-config-operator/machine-config-server-9xzz2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.412794 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a5f65f6-0be4-40dd-be78-16de7ada0614-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-254cf\" (UID: \"5a5f65f6-0be4-40dd-be78-16de7ada0614\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.412872 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2rkm\" (UniqueName: \"kubernetes.io/projected/0ffff74f-1337-47da-907a-f0e10382509d-kube-api-access-h2rkm\") pod \"collect-profiles-29321010-wlr9z\" (UID: \"0ffff74f-1337-47da-907a-f0e10382509d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.412952 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf81a909-a6ef-46c2-9d8f-ec12c2c748f4-config\") pod \"kube-controller-manager-operator-78b949d7b-bnsph\" (UID: \"cf81a909-a6ef-46c2-9d8f-ec12c2c748f4\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.413020 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e0786ae0-90bb-446a-96ae-5b522f776e0b-tmpfs\") pod \"packageserver-d55dfcdfc-8f88f\" (UID: \"e0786ae0-90bb-446a-96ae-5b522f776e0b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.413368 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc6cd\" (UniqueName: \"kubernetes.io/projected/056396e2-e9e7-4bb6-8be4-221c931d490e-kube-api-access-cc6cd\") pod \"machine-config-server-9xzz2\" (UID: \"056396e2-e9e7-4bb6-8be4-221c931d490e\") " pod="openshift-machine-config-operator/machine-config-server-9xzz2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.413464 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0786ae0-90bb-446a-96ae-5b522f776e0b-apiservice-cert\") pod \"packageserver-d55dfcdfc-8f88f\" (UID: \"e0786ae0-90bb-446a-96ae-5b522f776e0b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.417670 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-88nf5\" (UID: \"ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.417873 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/8efd871e-42ac-408d-9a1f-3635cb099a4b-config-volume\") pod \"dns-default-656jw\" (UID: \"8efd871e-42ac-408d-9a1f-3635cb099a4b\") " pod="openshift-dns/dns-default-656jw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.417945 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94c26543-8110-4758-8860-54fe4f1349aa-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r49lm\" (UID: \"94c26543-8110-4758-8860-54fe4f1349aa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.418076 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbdxj\" (UniqueName: \"kubernetes.io/projected/5a5f65f6-0be4-40dd-be78-16de7ada0614-kube-api-access-lbdxj\") pod \"kube-storage-version-migrator-operator-b67b599dd-254cf\" (UID: \"5a5f65f6-0be4-40dd-be78-16de7ada0614\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.418161 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0152fc34-26e8-4555-bb44-227eb61394b6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cscrv\" (UID: \"0152fc34-26e8-4555-bb44-227eb61394b6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.418233 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-plugins-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 
crc kubenswrapper[4553]: I0930 19:34:42.418305 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/58c5f2ca-b4da-444a-8ecc-22c758896df9-signing-key\") pod \"service-ca-9c57cc56f-h6f2z\" (UID: \"58c5f2ca-b4da-444a-8ecc-22c758896df9\") " pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.418383 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9202a914-8e69-4d7e-9337-eff52d5c4bd6-cert\") pod \"ingress-canary-lsmgs\" (UID: \"9202a914-8e69-4d7e-9337-eff52d5c4bd6\") " pod="openshift-ingress-canary/ingress-canary-lsmgs" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.418469 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066d34b5-1b98-40a8-98a5-59f7edcc43e4-config\") pod \"service-ca-operator-777779d784-4zxpw\" (UID: \"066d34b5-1b98-40a8-98a5-59f7edcc43e4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.418540 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5kxb\" (UniqueName: \"kubernetes.io/projected/8cb17eb3-73f0-4235-9bf8-11723b544e6e-kube-api-access-g5kxb\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.418642 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f416f93e-65d8-49ae-a035-40bf9857525a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bz92f\" (UID: \"f416f93e-65d8-49ae-a035-40bf9857525a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" Sep 30 
19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.418735 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f416f93e-65d8-49ae-a035-40bf9857525a-config\") pod \"kube-apiserver-operator-766d6c64bb-bz92f\" (UID: \"f416f93e-65d8-49ae-a035-40bf9857525a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.418818 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0152fc34-26e8-4555-bb44-227eb61394b6-srv-cert\") pod \"olm-operator-6b444d44fb-cscrv\" (UID: \"0152fc34-26e8-4555-bb44-227eb61394b6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.418890 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg4ng\" (UniqueName: \"kubernetes.io/projected/ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8-kube-api-access-hg4ng\") pod \"machine-config-controller-84d6567774-88nf5\" (UID: \"ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.418964 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ffff74f-1337-47da-907a-f0e10382509d-config-volume\") pod \"collect-profiles-29321010-wlr9z\" (UID: \"0ffff74f-1337-47da-907a-f0e10382509d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.419055 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-mountpoint-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: 
\"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.419265 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nmwg\" (UniqueName: \"kubernetes.io/projected/0152fc34-26e8-4555-bb44-227eb61394b6-kube-api-access-9nmwg\") pod \"olm-operator-6b444d44fb-cscrv\" (UID: \"0152fc34-26e8-4555-bb44-227eb61394b6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.419396 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf81a909-a6ef-46c2-9d8f-ec12c2c748f4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bnsph\" (UID: \"cf81a909-a6ef-46c2-9d8f-ec12c2c748f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.419494 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/38703e61-9d3b-4d8d-aae8-9740c0948ceb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zvmr2\" (UID: \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.419581 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m4fw\" (UniqueName: \"kubernetes.io/projected/ade6bc35-f568-4e61-b273-dbc590e64141-kube-api-access-8m4fw\") pod \"catalog-operator-68c6474976-4f82f\" (UID: \"ade6bc35-f568-4e61-b273-dbc590e64141\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.419711 4553 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-cwjr8\" (UniqueName: \"kubernetes.io/projected/70ef651c-3015-42e3-9d1f-df3833f6343a-kube-api-access-cwjr8\") pod \"multus-admission-controller-857f4d67dd-pncrd\" (UID: \"70ef651c-3015-42e3-9d1f-df3833f6343a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pncrd" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.419797 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-csi-data-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.419899 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/066d34b5-1b98-40a8-98a5-59f7edcc43e4-serving-cert\") pod \"service-ca-operator-777779d784-4zxpw\" (UID: \"066d34b5-1b98-40a8-98a5-59f7edcc43e4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.419969 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efd871e-42ac-408d-9a1f-3635cb099a4b-metrics-tls\") pod \"dns-default-656jw\" (UID: \"8efd871e-42ac-408d-9a1f-3635cb099a4b\") " pod="openshift-dns/dns-default-656jw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.420093 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-registration-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.420183 4553 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/51d0f112-9b15-4602-b8a6-16e79dfeb4cb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xz69w\" (UID: \"51d0f112-9b15-4602-b8a6-16e79dfeb4cb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.420274 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ffff74f-1337-47da-907a-f0e10382509d-secret-volume\") pod \"collect-profiles-29321010-wlr9z\" (UID: \"0ffff74f-1337-47da-907a-f0e10382509d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.420355 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/056396e2-e9e7-4bb6-8be4-221c931d490e-certs\") pod \"machine-config-server-9xzz2\" (UID: \"056396e2-e9e7-4bb6-8be4-221c931d490e\") " pod="openshift-machine-config-operator/machine-config-server-9xzz2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.420434 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtnnb\" (UniqueName: \"kubernetes.io/projected/38703e61-9d3b-4d8d-aae8-9740c0948ceb-kube-api-access-qtnnb\") pod \"marketplace-operator-79b997595-zvmr2\" (UID: \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.420513 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-socket-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 
19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.420609 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0786ae0-90bb-446a-96ae-5b522f776e0b-webhook-cert\") pod \"packageserver-d55dfcdfc-8f88f\" (UID: \"e0786ae0-90bb-446a-96ae-5b522f776e0b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.436899 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/70ef651c-3015-42e3-9d1f-df3833f6343a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pncrd\" (UID: \"70ef651c-3015-42e3-9d1f-df3833f6343a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pncrd" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.441371 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.445751 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vm6j\" (UniqueName: \"kubernetes.io/projected/5f298352-909f-4017-ae2a-ac8deda23167-kube-api-access-7vm6j\") pod \"machine-config-operator-74547568cd-qhwcx\" (UID: \"5f298352-909f-4017-ae2a-ac8deda23167\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.459935 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/70ef651c-3015-42e3-9d1f-df3833f6343a-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pncrd\" (UID: \"70ef651c-3015-42e3-9d1f-df3833f6343a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pncrd" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.414248 4553 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-bound-sa-token\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.462274 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-mountpoint-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.464805 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f416f93e-65d8-49ae-a035-40bf9857525a-config\") pod \"kube-apiserver-operator-766d6c64bb-bz92f\" (UID: \"f416f93e-65d8-49ae-a035-40bf9857525a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.465332 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rgm8h"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.467140 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6csmn"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.467231 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-djhpv"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.465608 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5f65f6-0be4-40dd-be78-16de7ada0614-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-254cf\" (UID: \"5a5f65f6-0be4-40dd-be78-16de7ada0614\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.469704 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-csi-data-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.466216 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/58c5f2ca-b4da-444a-8ecc-22c758896df9-signing-cabundle\") pod \"service-ca-9c57cc56f-h6f2z\" (UID: \"58c5f2ca-b4da-444a-8ecc-22c758896df9\") " pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.477033 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ffff74f-1337-47da-907a-f0e10382509d-config-volume\") pod \"collect-profiles-29321010-wlr9z\" (UID: \"0ffff74f-1337-47da-907a-f0e10382509d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.486889 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf81a909-a6ef-46c2-9d8f-ec12c2c748f4-config\") pod \"kube-controller-manager-operator-78b949d7b-bnsph\" (UID: \"cf81a909-a6ef-46c2-9d8f-ec12c2c748f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" Sep 30 19:34:42 crc kubenswrapper[4553]: E0930 19:34:42.487004 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-09-30 19:34:42.986972741 +0000 UTC m=+136.186474871 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.487365 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e0786ae0-90bb-446a-96ae-5b522f776e0b-tmpfs\") pod \"packageserver-d55dfcdfc-8f88f\" (UID: \"e0786ae0-90bb-446a-96ae-5b522f776e0b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.497030 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/056396e2-e9e7-4bb6-8be4-221c931d490e-node-bootstrap-token\") pod \"machine-config-server-9xzz2\" (UID: \"056396e2-e9e7-4bb6-8be4-221c931d490e\") " pod="openshift-machine-config-operator/machine-config-server-9xzz2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.502084 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8efd871e-42ac-408d-9a1f-3635cb099a4b-config-volume\") pod \"dns-default-656jw\" (UID: \"8efd871e-42ac-408d-9a1f-3635cb099a4b\") " pod="openshift-dns/dns-default-656jw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.502948 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/ade6bc35-f568-4e61-b273-dbc590e64141-profile-collector-cert\") pod \"catalog-operator-68c6474976-4f82f\" (UID: \"ade6bc35-f568-4e61-b273-dbc590e64141\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.506442 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-88nf5\" (UID: \"ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.526805 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-j2fv9"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.529685 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfks5\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-kube-api-access-jfks5\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.535429 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-plugins-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.536892 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a5f65f6-0be4-40dd-be78-16de7ada0614-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-254cf\" (UID: 
\"5a5f65f6-0be4-40dd-be78-16de7ada0614\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.536919 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0786ae0-90bb-446a-96ae-5b522f776e0b-webhook-cert\") pod \"packageserver-d55dfcdfc-8f88f\" (UID: \"e0786ae0-90bb-446a-96ae-5b522f776e0b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.537261 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/066d34b5-1b98-40a8-98a5-59f7edcc43e4-serving-cert\") pod \"service-ca-operator-777779d784-4zxpw\" (UID: \"066d34b5-1b98-40a8-98a5-59f7edcc43e4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.542955 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38703e61-9d3b-4d8d-aae8-9740c0948ceb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zvmr2\" (UID: \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.547153 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94c26543-8110-4758-8860-54fe4f1349aa-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-r49lm\" (UID: \"94c26543-8110-4758-8860-54fe4f1349aa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.562484 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0786ae0-90bb-446a-96ae-5b522f776e0b-apiservice-cert\") pod \"packageserver-d55dfcdfc-8f88f\" (UID: \"e0786ae0-90bb-446a-96ae-5b522f776e0b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.563998 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwjr8\" (UniqueName: \"kubernetes.io/projected/70ef651c-3015-42e3-9d1f-df3833f6343a-kube-api-access-cwjr8\") pod \"multus-admission-controller-857f4d67dd-pncrd\" (UID: \"70ef651c-3015-42e3-9d1f-df3833f6343a\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pncrd" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.569594 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.566423 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8-proxy-tls\") pod \"machine-config-controller-84d6567774-88nf5\" (UID: \"ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.566740 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-registration-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc 
kubenswrapper[4553]: I0930 19:34:42.567599 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0152fc34-26e8-4555-bb44-227eb61394b6-srv-cert\") pod \"olm-operator-6b444d44fb-cscrv\" (UID: \"0152fc34-26e8-4555-bb44-227eb61394b6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.568275 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqcwr\" (UniqueName: \"kubernetes.io/projected/9202a914-8e69-4d7e-9337-eff52d5c4bd6-kube-api-access-cqcwr\") pod \"ingress-canary-lsmgs\" (UID: \"9202a914-8e69-4d7e-9337-eff52d5c4bd6\") " pod="openshift-ingress-canary/ingress-canary-lsmgs" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.569399 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf81a909-a6ef-46c2-9d8f-ec12c2c748f4-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bnsph\" (UID: \"cf81a909-a6ef-46c2-9d8f-ec12c2c748f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.564599 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066d34b5-1b98-40a8-98a5-59f7edcc43e4-config\") pod \"service-ca-operator-777779d784-4zxpw\" (UID: \"066d34b5-1b98-40a8-98a5-59f7edcc43e4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" Sep 30 19:34:42 crc kubenswrapper[4553]: E0930 19:34:42.570778 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:43.070759315 +0000 UTC m=+136.270261445 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.572166 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf81a909-a6ef-46c2-9d8f-ec12c2c748f4-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bnsph\" (UID: \"cf81a909-a6ef-46c2-9d8f-ec12c2c748f4\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.572497 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nmwg\" (UniqueName: \"kubernetes.io/projected/0152fc34-26e8-4555-bb44-227eb61394b6-kube-api-access-9nmwg\") pod \"olm-operator-6b444d44fb-cscrv\" (UID: \"0152fc34-26e8-4555-bb44-227eb61394b6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.573242 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9202a914-8e69-4d7e-9337-eff52d5c4bd6-cert\") pod \"ingress-canary-lsmgs\" (UID: \"9202a914-8e69-4d7e-9337-eff52d5c4bd6\") " pod="openshift-ingress-canary/ingress-canary-lsmgs" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.576351 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0152fc34-26e8-4555-bb44-227eb61394b6-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cscrv\" (UID: 
\"0152fc34-26e8-4555-bb44-227eb61394b6\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.577912 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f416f93e-65d8-49ae-a035-40bf9857525a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bz92f\" (UID: \"f416f93e-65d8-49ae-a035-40bf9857525a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.578277 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ade6bc35-f568-4e61-b273-dbc590e64141-srv-cert\") pod \"catalog-operator-68c6474976-4f82f\" (UID: \"ade6bc35-f568-4e61-b273-dbc590e64141\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.579509 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8efd871e-42ac-408d-9a1f-3635cb099a4b-metrics-tls\") pod \"dns-default-656jw\" (UID: \"8efd871e-42ac-408d-9a1f-3635cb099a4b\") " pod="openshift-dns/dns-default-656jw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.581685 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/58c5f2ca-b4da-444a-8ecc-22c758896df9-signing-key\") pod \"service-ca-9c57cc56f-h6f2z\" (UID: \"58c5f2ca-b4da-444a-8ecc-22c758896df9\") " pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.582194 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/38703e61-9d3b-4d8d-aae8-9740c0948ceb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zvmr2\" (UID: 
\"38703e61-9d3b-4d8d-aae8-9740c0948ceb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.586608 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.591827 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/51d0f112-9b15-4602-b8a6-16e79dfeb4cb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xz69w\" (UID: \"51d0f112-9b15-4602-b8a6-16e79dfeb4cb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.594103 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5srrt\" (UniqueName: \"kubernetes.io/projected/94c26543-8110-4758-8860-54fe4f1349aa-kube-api-access-5srrt\") pod \"package-server-manager-789f6589d5-r49lm\" (UID: \"94c26543-8110-4758-8860-54fe4f1349aa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.596367 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ffff74f-1337-47da-907a-f0e10382509d-secret-volume\") pod \"collect-profiles-29321010-wlr9z\" (UID: \"0ffff74f-1337-47da-907a-f0e10382509d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.597593 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cb17eb3-73f0-4235-9bf8-11723b544e6e-socket-dir\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " 
pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.601197 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8sqv\" (UniqueName: \"kubernetes.io/projected/e0786ae0-90bb-446a-96ae-5b522f776e0b-kube-api-access-k8sqv\") pod \"packageserver-d55dfcdfc-8f88f\" (UID: \"e0786ae0-90bb-446a-96ae-5b522f776e0b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.601725 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/056396e2-e9e7-4bb6-8be4-221c931d490e-certs\") pod \"machine-config-server-9xzz2\" (UID: \"056396e2-e9e7-4bb6-8be4-221c931d490e\") " pod="openshift-machine-config-operator/machine-config-server-9xzz2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.614420 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f416f93e-65d8-49ae-a035-40bf9857525a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bz92f\" (UID: \"f416f93e-65d8-49ae-a035-40bf9857525a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.620176 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.622847 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg4ng\" (UniqueName: \"kubernetes.io/projected/ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8-kube-api-access-hg4ng\") pod \"machine-config-controller-84d6567774-88nf5\" (UID: \"ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" Sep 30 19:34:42 crc kubenswrapper[4553]: 
I0930 19:34:42.639208 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2rkm\" (UniqueName: \"kubernetes.io/projected/0ffff74f-1337-47da-907a-f0e10382509d-kube-api-access-h2rkm\") pod \"collect-profiles-29321010-wlr9z\" (UID: \"0ffff74f-1337-47da-907a-f0e10382509d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.643824 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.649442 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc6cd\" (UniqueName: \"kubernetes.io/projected/056396e2-e9e7-4bb6-8be4-221c931d490e-kube-api-access-cc6cd\") pod \"machine-config-server-9xzz2\" (UID: \"056396e2-e9e7-4bb6-8be4-221c931d490e\") " pod="openshift-machine-config-operator/machine-config-server-9xzz2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.649969 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.656161 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.670839 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.671717 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pncrd" Sep 30 19:34:42 crc kubenswrapper[4553]: E0930 19:34:42.674482 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:43.174449812 +0000 UTC m=+136.373951942 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.677648 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbdxj\" (UniqueName: \"kubernetes.io/projected/5a5f65f6-0be4-40dd-be78-16de7ada0614-kube-api-access-lbdxj\") pod \"kube-storage-version-migrator-operator-b67b599dd-254cf\" (UID: \"5a5f65f6-0be4-40dd-be78-16de7ada0614\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.684918 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.710497 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.725445 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.729883 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7gdg\" (UniqueName: \"kubernetes.io/projected/51d0f112-9b15-4602-b8a6-16e79dfeb4cb-kube-api-access-n7gdg\") pod \"control-plane-machine-set-operator-78cbb6b69f-xz69w\" (UID: \"51d0f112-9b15-4602-b8a6-16e79dfeb4cb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.733390 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.736590 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92k7m\" (UniqueName: \"kubernetes.io/projected/066d34b5-1b98-40a8-98a5-59f7edcc43e4-kube-api-access-92k7m\") pod \"service-ca-operator-777779d784-4zxpw\" (UID: \"066d34b5-1b98-40a8-98a5-59f7edcc43e4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.745634 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.748347 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.753670 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvpgj\" (UniqueName: \"kubernetes.io/projected/58c5f2ca-b4da-444a-8ecc-22c758896df9-kube-api-access-lvpgj\") pod \"service-ca-9c57cc56f-h6f2z\" (UID: \"58c5f2ca-b4da-444a-8ecc-22c758896df9\") " pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.764248 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc58j\" (UniqueName: \"kubernetes.io/projected/8efd871e-42ac-408d-9a1f-3635cb099a4b-kube-api-access-nc58j\") pod \"dns-default-656jw\" (UID: \"8efd871e-42ac-408d-9a1f-3635cb099a4b\") " pod="openshift-dns/dns-default-656jw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.766759 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9xzz2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.776212 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: E0930 19:34:42.776779 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:43.276761103 +0000 UTC m=+136.476263233 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.777540 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.784204 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.790337 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x8hm\" (UniqueName: \"kubernetes.io/projected/bf4bad02-1539-46e4-a436-5c4f2c94fda1-kube-api-access-5x8hm\") pod \"migrator-59844c95c7-jbl46\" (UID: \"bf4bad02-1539-46e4-a436-5c4f2c94fda1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbl46" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.805072 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5kxb\" (UniqueName: \"kubernetes.io/projected/8cb17eb3-73f0-4235-9bf8-11723b544e6e-kube-api-access-g5kxb\") pod \"csi-hostpathplugin-vm8xl\" (UID: \"8cb17eb3-73f0-4235-9bf8-11723b544e6e\") " pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.810836 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.820427 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-656jw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.827181 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lsmgs" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.852794 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m4fw\" (UniqueName: \"kubernetes.io/projected/ade6bc35-f568-4e61-b273-dbc590e64141-kube-api-access-8m4fw\") pod \"catalog-operator-68c6474976-4f82f\" (UID: \"ade6bc35-f568-4e61-b273-dbc590e64141\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.870860 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtnnb\" (UniqueName: \"kubernetes.io/projected/38703e61-9d3b-4d8d-aae8-9740c0948ceb-kube-api-access-qtnnb\") pod \"marketplace-operator-79b997595-zvmr2\" (UID: \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\") " pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.878606 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:42 crc kubenswrapper[4553]: E0930 19:34:42.880350 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:43.380329217 +0000 UTC m=+136.579831347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.932585 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dpgkt"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.933680 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-z7r6d"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.964412 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.979810 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.980931 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:42 crc kubenswrapper[4553]: E0930 19:34:42.981359 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 19:34:43.481345302 +0000 UTC m=+136.680847432 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.996593 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b"] Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.998097 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbl46" Sep 30 19:34:42 crc kubenswrapper[4553]: I0930 19:34:42.998763 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w" Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.004415 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.010608 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp"] Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.022548 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rddkb"] Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.038827 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pncrd"] Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.039982 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nft69"] Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.057838 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:34:43 crc kubenswrapper[4553]: W0930 19:34:43.065085 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf65a9561_f1e7_48f5_ab37_4c59699c0b6f.slice/crio-911956a08413abd08da61f8199fdb9735117ca985bfe48b5dfb98f32a958b6bb WatchSource:0}: Error finding container 911956a08413abd08da61f8199fdb9735117ca985bfe48b5dfb98f32a958b6bb: Status 404 returned error can't find the container with id 911956a08413abd08da61f8199fdb9735117ca985bfe48b5dfb98f32a958b6bb Sep 30 19:34:43 crc kubenswrapper[4553]: W0930 19:34:43.077597 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd13fb01_b6ec_486e_8b39_7440a349ae64.slice/crio-1da73946f2c48091c452ec2b98cf3d9da50c973e549e29b7d43090d2326f0469 WatchSource:0}: Error finding container 
1da73946f2c48091c452ec2b98cf3d9da50c973e549e29b7d43090d2326f0469: Status 404 returned error can't find the container with id 1da73946f2c48091c452ec2b98cf3d9da50c973e549e29b7d43090d2326f0469 Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.082005 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:43 crc kubenswrapper[4553]: E0930 19:34:43.082169 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:43.582141852 +0000 UTC m=+136.781643982 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.083319 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:43 crc kubenswrapper[4553]: E0930 19:34:43.083793 4553 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:43.583775947 +0000 UTC m=+136.783278077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:43 crc kubenswrapper[4553]: W0930 19:34:43.131235 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70ef651c_3015_42e3_9d1f_df3833f6343a.slice/crio-6f1908ae5d2f8369e250de4f09fd58bc3cf059d5a04707ef0dc0db5bb9e2cdd9 WatchSource:0}: Error finding container 6f1908ae5d2f8369e250de4f09fd58bc3cf059d5a04707ef0dc0db5bb9e2cdd9: Status 404 returned error can't find the container with id 6f1908ae5d2f8369e250de4f09fd58bc3cf059d5a04707ef0dc0db5bb9e2cdd9 Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.186761 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:43 crc kubenswrapper[4553]: E0930 19:34:43.187248 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 19:34:43.687220477 +0000 UTC m=+136.886722607 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.187499 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:43 crc kubenswrapper[4553]: E0930 19:34:43.212493 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:43.712459503 +0000 UTC m=+136.911961633 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.290356 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:43 crc kubenswrapper[4553]: E0930 19:34:43.290823 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:43.790802901 +0000 UTC m=+136.990305031 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.381910 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5"] Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.392595 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:43 crc kubenswrapper[4553]: E0930 19:34:43.392982 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:43.892966828 +0000 UTC m=+137.092468958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.424965 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7" event={"ID":"5c8d22af-7187-427b-8a5c-24e49c3e96cc","Type":"ContainerStarted","Data":"f6ae641ef0f8a03f88bea9fbdca709962a12417f7dc12c5de892aec1ae044629"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.436481 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" event={"ID":"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc","Type":"ContainerStarted","Data":"28db9681f6ee335021f42398474e8a7bedfc19b581d2548504c55f16106e5618"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.449822 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" event={"ID":"323f9188-3789-4e7c-b4d2-17f051188a15","Type":"ContainerStarted","Data":"0383261f1ee3ce21e3f490c72e8daa8505abf8b3517fe92786dd2e2b7897d117"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.479204 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" event={"ID":"fd13fb01-b6ec-486e-8b39-7440a349ae64","Type":"ContainerStarted","Data":"1da73946f2c48091c452ec2b98cf3d9da50c973e549e29b7d43090d2326f0469"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.480582 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b" event={"ID":"59c0a5ab-f3fc-4842-b79e-8d46d0c479b2","Type":"ContainerStarted","Data":"ef1718af691ebe27931ba801bdbfbcebae7113f6dfec5ad1cd6e3244ea6c2aac"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.481701 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-j2fv9" event={"ID":"76d4f83a-1f82-4374-bc4d-601f752d318d","Type":"ContainerStarted","Data":"54948657426ceead5a307ba3b796b16739eb00b5362c802bb88e743cc6cb0b76"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.485197 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dpgkt" event={"ID":"f65a9561-f1e7-48f5-ab37-4c59699c0b6f","Type":"ContainerStarted","Data":"911956a08413abd08da61f8199fdb9735117ca985bfe48b5dfb98f32a958b6bb"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.489131 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" event={"ID":"d1525a24-eab3-489e-b3c2-2ab74a2c5a60","Type":"ContainerStarted","Data":"061e2744371f37241da7d62fc136b263068bd16905d0944e41f1cfc142b41c7f"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.493319 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:43 crc kubenswrapper[4553]: E0930 19:34:43.494102 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 19:34:43.994081407 +0000 UTC m=+137.193583537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.495210 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6bqsb" event={"ID":"06f52f14-b54f-4666-9413-e299c6ad0f22","Type":"ContainerStarted","Data":"620bc8c77a099ece9053ad94dd39de15ee2176ff23c53207c535121c7942b3e3"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.496484 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" event={"ID":"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3","Type":"ContainerStarted","Data":"5aeffe9eec8bb2d17c85e3defe60d5c55491d98425b450cdb800d42f847279b7"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.501627 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-djhpv" event={"ID":"be6f13fb-81da-4176-8540-a3fa61cd7002","Type":"ContainerStarted","Data":"96ab30cb7cc39104f5d32fdac4ae1b3f84ec3d88387175c7c38d4ef60144c9d1"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.502406 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6csmn" event={"ID":"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74","Type":"ContainerStarted","Data":"2d851c626bb46043b553592433c219a24a04a7d07d41202b56bda105435b794d"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.512448 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" event={"ID":"b23246e9-901f-436d-b8c4-d9ffc47dc3a7","Type":"ContainerStarted","Data":"ba7341ec385534012dcb4d7b33ab83561e8758bb3c8eb0a4bfcbfcc71f2307cb"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.512922 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pncrd" event={"ID":"70ef651c-3015-42e3-9d1f-df3833f6343a","Type":"ContainerStarted","Data":"6f1908ae5d2f8369e250de4f09fd58bc3cf059d5a04707ef0dc0db5bb9e2cdd9"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.517200 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n" event={"ID":"e38ed88a-7879-4394-b102-aa5ad331aa5e","Type":"ContainerStarted","Data":"ac9954c2da9f88f1527230c5fc7896b15cd78dd531f7e68f4e211eaae47b8cb2"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.518557 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" event={"ID":"a88973e4-6669-4d85-89b6-2d287df271ea","Type":"ContainerStarted","Data":"87ebec159df8a81819870231f82820b2187c3c35c20d2a479edafa593eb4ce54"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.518582 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" event={"ID":"a88973e4-6669-4d85-89b6-2d287df271ea","Type":"ContainerStarted","Data":"4582e5dbfeab1c9e7b6343b8a0d7c7d65c04370fa0dfedc7f38d71604582502f"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.522342 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j" event={"ID":"c07734d2-f320-4fa6-b259-39862951b066","Type":"ContainerStarted","Data":"ef668243555175f1c9f062c8b01d2fea119fd60cfb22076e290ab90317515d6c"} Sep 30 19:34:43 
crc kubenswrapper[4553]: I0930 19:34:43.534527 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" event={"ID":"d29d9e48-584a-4c95-a9b7-1039a800071e","Type":"ContainerStarted","Data":"9764ed8908de786ddaa64d7602cf6f3908df756497cb4843a9f056c5e9469400"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.538652 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" event={"ID":"96bf498c-034c-431c-ae07-4099724a48a7","Type":"ContainerStarted","Data":"5eaf684c2642f20caf01e6c5e97e920442e6e27fde9d0f73247cf38d2677083e"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.543953 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx"] Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.545274 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r22xf" event={"ID":"87431bdf-f949-4c35-916f-e14903939fe1","Type":"ContainerStarted","Data":"76a741bf0d04bd74e5e11f898392f4e19201b051a1b4655ea604c5ef7839e202"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.549171 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" event={"ID":"77393764-0c2d-4822-8558-71f98dbaef2f","Type":"ContainerStarted","Data":"e4a2a3211a1c067abcd6b6d3a341e52cafd23859d2105f4e520f36df0710c72a"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.549996 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" event={"ID":"0e392aad-9ae5-4942-a078-8ef9cbaffb90","Type":"ContainerStarted","Data":"974e3056057203644dfd96178b3c5437e4dce3661ab2b48d1166d6a4adfe096b"} Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.595436 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:43 crc kubenswrapper[4553]: E0930 19:34:43.596030 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:44.096003856 +0000 UTC m=+137.295505986 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.697240 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:43 crc kubenswrapper[4553]: E0930 19:34:43.697494 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:44.197455984 +0000 UTC m=+137.396958104 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.697988 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:43 crc kubenswrapper[4553]: E0930 19:34:43.701381 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:44.201359009 +0000 UTC m=+137.400861139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.726509 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z"] Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.742486 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-656jw"] Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.816372 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:43 crc kubenswrapper[4553]: E0930 19:34:43.826684 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:44.326646024 +0000 UTC m=+137.526148144 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:43 crc kubenswrapper[4553]: I0930 19:34:43.930123 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:43 crc kubenswrapper[4553]: E0930 19:34:43.930570 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:44.430557587 +0000 UTC m=+137.630059717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.033099 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:44 crc kubenswrapper[4553]: E0930 19:34:44.033499 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:44.533461274 +0000 UTC m=+137.732963404 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.033944 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:44 crc kubenswrapper[4553]: E0930 19:34:44.034477 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:44.534459981 +0000 UTC m=+137.733962111 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.044324 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zvmr2"] Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.061138 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.077619 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.077696 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.119437 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f"] Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.135920 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:44 crc kubenswrapper[4553]: E0930 19:34:44.136258 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:44.636242227 +0000 UTC m=+137.835744357 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.249848 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:44 crc kubenswrapper[4553]: E0930 19:34:44.250200 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:44.750187669 +0000 UTC m=+137.949689799 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.281318 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm"] Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.283403 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4lrff" podStartSLOduration=116.283386478 podStartE2EDuration="1m56.283386478s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:44.269506086 +0000 UTC m=+137.469008216" watchObservedRunningTime="2025-09-30 19:34:44.283386478 +0000 UTC m=+137.482888598" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.317221 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r22xf" podStartSLOduration=116.317199913 podStartE2EDuration="1m56.317199913s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:44.316458864 +0000 UTC m=+137.515960994" watchObservedRunningTime="2025-09-30 19:34:44.317199913 +0000 UTC m=+137.516702043" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.358795 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf"] Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.359572 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:44 crc kubenswrapper[4553]: E0930 19:34:44.359998 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:44.85997578 +0000 UTC m=+138.059477900 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.468743 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:44 crc kubenswrapper[4553]: E0930 19:34:44.469685 4553 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:44.969672827 +0000 UTC m=+138.169174947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.558147 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" event={"ID":"ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8","Type":"ContainerStarted","Data":"90b7eb25e70e6113de7547b01a473aaad19127230a4e06096732ce0d18b17f34"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.560194 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-j2fv9" event={"ID":"76d4f83a-1f82-4374-bc4d-601f752d318d","Type":"ContainerStarted","Data":"bca45515c046ff41706b9f3fb3aa4c2d92001e669071794e21cb02241f2a9677"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.562228 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-j2fv9" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.569933 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:44 crc kubenswrapper[4553]: 
E0930 19:34:44.571567 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:45.071548066 +0000 UTC m=+138.271050196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.571671 4553 patch_prober.go:28] interesting pod/downloads-7954f5f757-j2fv9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.571723 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-j2fv9" podUID="76d4f83a-1f82-4374-bc4d-601f752d318d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.598555 4553 generic.go:334] "Generic (PLEG): container finished" podID="19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc" containerID="177ed20491be343db794e8c0bace7a5046b58297f63c08a9c5537b83fa9c0ec6" exitCode=0 Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.598699 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" 
event={"ID":"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc","Type":"ContainerDied","Data":"177ed20491be343db794e8c0bace7a5046b58297f63c08a9c5537b83fa9c0ec6"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.600634 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-j2fv9" podStartSLOduration=116.600606115 podStartE2EDuration="1m56.600606115s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:44.596166236 +0000 UTC m=+137.795668366" watchObservedRunningTime="2025-09-30 19:34:44.600606115 +0000 UTC m=+137.800108245" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.624484 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6csmn" event={"ID":"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74","Type":"ContainerStarted","Data":"d12d09639e821a5b5300bead7b9ff300c154e5224ee6024d82571cec8f826ab2"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.635675 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" event={"ID":"96bf498c-034c-431c-ae07-4099724a48a7","Type":"ContainerStarted","Data":"729718dc8f3535ff2e09a156c69f54432338cc37d1105fae6534c61833203332"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.636543 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.640232 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-656jw" event={"ID":"8efd871e-42ac-408d-9a1f-3635cb099a4b","Type":"ContainerStarted","Data":"624830a1d07487e6c0149b0e0c381fa1b2ffe9e98b8c555a59470a068941ab35"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.655138 4553 patch_prober.go:28] interesting 
pod/controller-manager-879f6c89f-5dq4n container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.655214 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" podUID="96bf498c-034c-431c-ae07-4099724a48a7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.662454 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" podStartSLOduration=116.662430851 podStartE2EDuration="1m56.662430851s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:44.662142463 +0000 UTC m=+137.861644583" watchObservedRunningTime="2025-09-30 19:34:44.662430851 +0000 UTC m=+137.861932981" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.668325 4553 generic.go:334] "Generic (PLEG): container finished" podID="be6f13fb-81da-4176-8540-a3fa61cd7002" containerID="8248bdc181bd17aa32ae44cff70355ff1880148d87a3bfa10fba27b51d138c41" exitCode=0 Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.668867 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-djhpv" event={"ID":"be6f13fb-81da-4176-8540-a3fa61cd7002","Type":"ContainerDied","Data":"8248bdc181bd17aa32ae44cff70355ff1880148d87a3bfa10fba27b51d138c41"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.671573 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:44 crc kubenswrapper[4553]: E0930 19:34:44.684808 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:45.184781469 +0000 UTC m=+138.384283599 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.695004 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-6csmn" podStartSLOduration=116.694967292 podStartE2EDuration="1m56.694967292s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:44.692562938 +0000 UTC m=+137.892065068" watchObservedRunningTime="2025-09-30 19:34:44.694967292 +0000 UTC m=+137.894469422" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.707118 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6bqsb" 
event={"ID":"06f52f14-b54f-4666-9413-e299c6ad0f22","Type":"ContainerStarted","Data":"13dd217705325e478dc916dc3c04b8f7335d1a3d91fa0db5ca46be45aa35395e"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.709313 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.742961 4553 patch_prober.go:28] interesting pod/console-operator-58897d9998-6bqsb container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.743085 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6bqsb" podUID="06f52f14-b54f-4666-9413-e299c6ad0f22" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.742975 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j" event={"ID":"c07734d2-f320-4fa6-b259-39862951b066","Type":"ContainerStarted","Data":"8a1dc6738483cb61f653115a40a9fc2944549eb280b3aa5539516143a9fda448"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.775960 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n" event={"ID":"e38ed88a-7879-4394-b102-aa5ad331aa5e","Type":"ContainerStarted","Data":"4ca6624fdca01f1bffcbeb1200c25ceed30e06cfcf77230ccf959e4b239c38f2"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.789536 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:44 crc kubenswrapper[4553]: E0930 19:34:44.811343 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:45.311314218 +0000 UTC m=+138.510816348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.835980 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-6bqsb" podStartSLOduration=116.835956749 podStartE2EDuration="1m56.835956749s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:44.774511423 +0000 UTC m=+137.974013553" watchObservedRunningTime="2025-09-30 19:34:44.835956749 +0000 UTC m=+138.035458879" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.853620 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph"] Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.861529 4553 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" event={"ID":"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3","Type":"ContainerStarted","Data":"430ec297aa2731648b62b620376379bedb450f28b8f9e760bc2096d0ce7fdd37"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.864322 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.868202 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xbl8j" podStartSLOduration=116.868170612 podStartE2EDuration="1m56.868170612s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:44.804557708 +0000 UTC m=+138.004059838" watchObservedRunningTime="2025-09-30 19:34:44.868170612 +0000 UTC m=+138.067672752" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.879058 4553 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2chmh container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" start-of-body= Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.879124 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" podUID="8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.886225 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw"] Sep 30 19:34:44 
crc kubenswrapper[4553]: I0930 19:34:44.896611 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7gg7n" podStartSLOduration=116.896583232 podStartE2EDuration="1m56.896583232s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:44.879411693 +0000 UTC m=+138.078913823" watchObservedRunningTime="2025-09-30 19:34:44.896583232 +0000 UTC m=+138.096085352" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.900865 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" event={"ID":"b23246e9-901f-436d-b8c4-d9ffc47dc3a7","Type":"ContainerStarted","Data":"14ec8fcb4b3fb1663132e781d80ab89acb48f051f5490693dd022ac9327851b1"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.915347 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" event={"ID":"5f298352-909f-4017-ae2a-ac8deda23167","Type":"ContainerStarted","Data":"9b40c6a9de24ac69bb7ab9620926596b0d7e865a36d061c78fabb079f0aa5173"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.925018 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv"] Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.927198 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" event={"ID":"0e392aad-9ae5-4942-a078-8ef9cbaffb90","Type":"ContainerStarted","Data":"626ec327a7a45bd0959c33a538624e89ff3647a9d6159d0b2e9fb933345564b0"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.928646 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.929632 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" event={"ID":"0ffff74f-1337-47da-907a-f0e10382509d","Type":"ContainerStarted","Data":"8ce8018552e2beca80293e30943c8049adb29398d79ebbe19a253d8aa3f879c0"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.929949 4553 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6tq2m container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.930014 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" podUID="0e392aad-9ae5-4942-a078-8ef9cbaffb90" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.934473 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" podStartSLOduration=116.934450056 podStartE2EDuration="1m56.934450056s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:44.914766319 +0000 UTC m=+138.114268449" watchObservedRunningTime="2025-09-30 19:34:44.934450056 +0000 UTC m=+138.133952186" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.935431 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f"] Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.939390 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-rgm8h" podStartSLOduration=116.939381229 podStartE2EDuration="1m56.939381229s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:44.935472674 +0000 UTC m=+138.134974804" watchObservedRunningTime="2025-09-30 19:34:44.939381229 +0000 UTC m=+138.138883359" Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.940893 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:44 crc kubenswrapper[4553]: E0930 19:34:44.947436 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:45.447415704 +0000 UTC m=+138.646917834 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.953316 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" event={"ID":"38703e61-9d3b-4d8d-aae8-9740c0948ceb","Type":"ContainerStarted","Data":"fd32379c8dd6ee77c35fd21304f2575a073102f0707cdf467ebb5f8fc1ff6ff6"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.960168 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" event={"ID":"f416f93e-65d8-49ae-a035-40bf9857525a","Type":"ContainerStarted","Data":"a18039ad95decb590e560acece82d2ebeaea0b08105a41bceae761265d65bada"} Sep 30 19:34:44 crc kubenswrapper[4553]: I0930 19:34:44.967188 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9xzz2" event={"ID":"056396e2-e9e7-4bb6-8be4-221c931d490e","Type":"ContainerStarted","Data":"6188f0e8f46734caa441ff31b5fc0c50459320da93000d092eba4ad5e09cb0ea"} Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.041965 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.042866 4553 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:45.542150661 +0000 UTC m=+138.741652791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.043084 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.046029 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:45.546005015 +0000 UTC m=+138.745507145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.060642 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" podStartSLOduration=117.060608186 podStartE2EDuration="1m57.060608186s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:44.967147733 +0000 UTC m=+138.166649883" watchObservedRunningTime="2025-09-30 19:34:45.060608186 +0000 UTC m=+138.260110316" Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.065640 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vm8xl"] Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.074844 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:45 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:45 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:45 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.074947 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.144715 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.144879 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:45.644852653 +0000 UTC m=+138.844354783 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.145012 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.145469 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-09-30 19:34:45.645453419 +0000 UTC m=+138.844955549 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.165256 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w"] Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.194358 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lsmgs"] Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.206679 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f"] Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.234326 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.245738 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.246705 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:45.746673359 +0000 UTC m=+138.946175489 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.246992 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.247611 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:45.747601595 +0000 UTC m=+138.947103725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.307795 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jbl46"] Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.330111 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h6f2z"] Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.350385 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.350610 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:45.850578402 +0000 UTC m=+139.050080532 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.453788 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.454135 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:45.954122956 +0000 UTC m=+139.153625086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.554559 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.554784 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.054739181 +0000 UTC m=+139.254241311 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.554911 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.555235 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.055221324 +0000 UTC m=+139.254723454 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.655611 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.655786 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.155756856 +0000 UTC m=+139.355258996 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.655898 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.656295 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.156286331 +0000 UTC m=+139.355788461 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.757103 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.757309 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.257274446 +0000 UTC m=+139.456776576 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.757961 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.758510 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.258487898 +0000 UTC m=+139.457990228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.862822 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.863247 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.363226803 +0000 UTC m=+139.562728923 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.964730 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:45 crc kubenswrapper[4553]: E0930 19:34:45.965325 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.465302517 +0000 UTC m=+139.664804647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.985992 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" event={"ID":"0ffff74f-1337-47da-907a-f0e10382509d","Type":"ContainerStarted","Data":"e7f1e50d9b1b978a8c2db674f12cae304e7107728e65a07d0393edd53b4b20e3"} Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.989279 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" event={"ID":"ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8","Type":"ContainerStarted","Data":"58407d7d23cabcc67aeafafddc4ee285bc8d48e761ecddae9f6151f72a494dbc"} Sep 30 19:34:45 crc kubenswrapper[4553]: I0930 19:34:45.995784 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" event={"ID":"fd13fb01-b6ec-486e-8b39-7440a349ae64","Type":"ContainerStarted","Data":"d3e03b4897d36e5cdb0a8d1edf425b9078806f54a80f0f79eb29aca98203506f"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.000492 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w" event={"ID":"51d0f112-9b15-4602-b8a6-16e79dfeb4cb","Type":"ContainerStarted","Data":"03085b342c05ce37742e186e4028eb6fed96fa862b7cae395a258f64a7b9e499"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.004233 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" podStartSLOduration=119.004220069 podStartE2EDuration="1m59.004220069s" podCreationTimestamp="2025-09-30 19:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:46.002163424 +0000 UTC m=+139.201665554" watchObservedRunningTime="2025-09-30 19:34:46.004220069 +0000 UTC m=+139.203722199" Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.006722 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" event={"ID":"94c26543-8110-4758-8860-54fe4f1349aa","Type":"ContainerStarted","Data":"94b71a44e2fdb35b4739011f534aeeb5ce34ff8fe652926e8b7b802f720acccc"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.008742 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" event={"ID":"e0786ae0-90bb-446a-96ae-5b522f776e0b","Type":"ContainerStarted","Data":"a9a04c3e32e4a64724920fb9417d88b1e263911a83bf9675d72106f4728a75fa"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.013035 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b" event={"ID":"59c0a5ab-f3fc-4842-b79e-8d46d0c479b2","Type":"ContainerStarted","Data":"316b37f45a2a2f20184f26a46d025cb6615d7b7a01f667d9afa1425ba05a2b65"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.015099 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" event={"ID":"ade6bc35-f568-4e61-b273-dbc590e64141","Type":"ContainerStarted","Data":"c9aa8bee2e5a408c923609d8b1a5e580a9845f36e1b7d5aa9c36fa57cedf565f"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.020477 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" event={"ID":"5f298352-909f-4017-ae2a-ac8deda23167","Type":"ContainerStarted","Data":"64780d1075e7069476ded5aca51819748cc6154699bdd764a85fa6392021b55e"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.022750 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pncrd" event={"ID":"70ef651c-3015-42e3-9d1f-df3833f6343a","Type":"ContainerStarted","Data":"d94abe50f29e15a53ee32988b34cc1c7e2d229533254fe802ac765d429598bfe"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.025171 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7" event={"ID":"5c8d22af-7187-427b-8a5c-24e49c3e96cc","Type":"ContainerStarted","Data":"17e3b9d7348337297e769c3ebed2067fb0fe004da9368925a280deee5c85405c"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.030929 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" event={"ID":"77393764-0c2d-4822-8558-71f98dbaef2f","Type":"ContainerStarted","Data":"e508dc74d24831b865f91bb019a60114684e9a70e7ac74a14411223b2267ccfe"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.032852 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dpgkt" event={"ID":"f65a9561-f1e7-48f5-ab37-4c59699c0b6f","Type":"ContainerStarted","Data":"b72afe74aee7de1786cdda3e2c4c80d739490d913bf3f1c5ed1addb141443045"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.034008 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" event={"ID":"8cb17eb3-73f0-4235-9bf8-11723b544e6e","Type":"ContainerStarted","Data":"cd76468ce64a2049185f83d267f4ec15b3936bfeb3bcc9ac0755a3d124ebb262"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.035943 
4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" event={"ID":"0152fc34-26e8-4555-bb44-227eb61394b6","Type":"ContainerStarted","Data":"70487981ac4d254ff1d9a6f1fb09913c8028fd04b2b68b7c24682f34701f0a0c"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.043148 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqs7" podStartSLOduration=118.043125971 podStartE2EDuration="1m58.043125971s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:46.039566367 +0000 UTC m=+139.239068487" watchObservedRunningTime="2025-09-30 19:34:46.043125971 +0000 UTC m=+139.242628101" Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.043575 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" event={"ID":"d1525a24-eab3-489e-b3c2-2ab74a2c5a60","Type":"ContainerStarted","Data":"19d09ae3e8e3a65b8ed5e193930b8189900cdca2a43e00a0f7c7451f132bab8a"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.045525 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" event={"ID":"066d34b5-1b98-40a8-98a5-59f7edcc43e4","Type":"ContainerStarted","Data":"1cf761f5ca8188fc66a7116026b37761e9efdec9ad2547d2900638089cec565a"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.047115 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" event={"ID":"5a5f65f6-0be4-40dd-be78-16de7ada0614","Type":"ContainerStarted","Data":"87cc0445fb9d4fb435f68376ef56a1a8e750617dd905cbde6c0205e03c4a3799"} Sep 30 19:34:46 crc 
kubenswrapper[4553]: I0930 19:34:46.049365 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbl46" event={"ID":"bf4bad02-1539-46e4-a436-5c4f2c94fda1","Type":"ContainerStarted","Data":"4bc9a1d3fc7a27efbdfeae621e58266eb34eebcd4b222b568a3917dca5f6f09c"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.051173 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" event={"ID":"58c5f2ca-b4da-444a-8ecc-22c758896df9","Type":"ContainerStarted","Data":"5ea3bdf18099e31cf7da92d3c00b60b9083da337aacc605e5e81c469662142a0"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.056334 4553 generic.go:334] "Generic (PLEG): container finished" podID="323f9188-3789-4e7c-b4d2-17f051188a15" containerID="9ef9813d045b9fc193b5f7b8d8844d0e4b8c87a99a5f505dd1f1f7750e4fa5be" exitCode=0 Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.056415 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" event={"ID":"323f9188-3789-4e7c-b4d2-17f051188a15","Type":"ContainerDied","Data":"9ef9813d045b9fc193b5f7b8d8844d0e4b8c87a99a5f505dd1f1f7750e4fa5be"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.064644 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lsmgs" event={"ID":"9202a914-8e69-4d7e-9337-eff52d5c4bd6","Type":"ContainerStarted","Data":"2b648b7f0ce9076f3b3679c390f5f22e335c19b7496869e170924c5a575bb211"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.065647 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:46 crc 
kubenswrapper[4553]: E0930 19:34:46.065797 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.565766477 +0000 UTC m=+139.765268607 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.066054 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:46 crc kubenswrapper[4553]: E0930 19:34:46.066501 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.566486197 +0000 UTC m=+139.765988327 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.068653 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" event={"ID":"cf81a909-a6ef-46c2-9d8f-ec12c2c748f4","Type":"ContainerStarted","Data":"950679381eaf402b456f29ec6df0894ec885025d0642d608e1e8253e76caf03d"} Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.070868 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:46 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:46 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:46 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.070912 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.071764 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" event={"ID":"d29d9e48-584a-4c95-a9b7-1039a800071e","Type":"ContainerStarted","Data":"b6d700ad66c4162fa9beac2876104350101511f8c22dcba36eac48249df8de01"} Sep 30 19:34:46 
crc kubenswrapper[4553]: I0930 19:34:46.072902 4553 patch_prober.go:28] interesting pod/downloads-7954f5f757-j2fv9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.072935 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-j2fv9" podUID="76d4f83a-1f82-4374-bc4d-601f752d318d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.072902 4553 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6tq2m container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.073262 4553 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5dq4n container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.073290 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" podUID="0e392aad-9ae5-4942-a078-8ef9cbaffb90" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.073297 4553 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" podUID="96bf498c-034c-431c-ae07-4099724a48a7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.073344 4553 patch_prober.go:28] interesting pod/console-operator-58897d9998-6bqsb container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.073423 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6bqsb" podUID="06f52f14-b54f-4666-9413-e299c6ad0f22" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.075713 4553 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2chmh container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" start-of-body= Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.075761 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" podUID="8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.103868 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-nft69" podStartSLOduration=118.103850967 
podStartE2EDuration="1m58.103850967s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:46.102725568 +0000 UTC m=+139.302227698" watchObservedRunningTime="2025-09-30 19:34:46.103850967 +0000 UTC m=+139.303353097" Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.167586 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:46 crc kubenswrapper[4553]: E0930 19:34:46.167829 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.667807771 +0000 UTC m=+139.867309901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.170567 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:46 crc kubenswrapper[4553]: E0930 19:34:46.176134 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.676116644 +0000 UTC m=+139.875618774 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.280130 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:46 crc kubenswrapper[4553]: E0930 19:34:46.280514 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.78049067 +0000 UTC m=+139.979992800 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.286429 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:46 crc kubenswrapper[4553]: E0930 19:34:46.286996 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.786982743 +0000 UTC m=+139.986484873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.387570 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:46 crc kubenswrapper[4553]: E0930 19:34:46.388138 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.888119022 +0000 UTC m=+140.087621152 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.489434 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:46 crc kubenswrapper[4553]: E0930 19:34:46.489971 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:46.989941739 +0000 UTC m=+140.189444019 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.590718 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:46 crc kubenswrapper[4553]: E0930 19:34:46.591144 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:47.09112578 +0000 UTC m=+140.290627910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.692853 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:46 crc kubenswrapper[4553]: E0930 19:34:46.693389 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:47.193369648 +0000 UTC m=+140.392871778 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.794938 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:46 crc kubenswrapper[4553]: E0930 19:34:46.795280 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:47.295263057 +0000 UTC m=+140.494765187 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:46 crc kubenswrapper[4553]: I0930 19:34:46.899275 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:46 crc kubenswrapper[4553]: E0930 19:34:46.899825 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:47.399806177 +0000 UTC m=+140.599308307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.001603 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:47 crc kubenswrapper[4553]: E0930 19:34:47.001960 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:47.501943143 +0000 UTC m=+140.701445273 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.064766 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:47 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:47 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:47 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.065355 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.077238 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-djhpv" event={"ID":"be6f13fb-81da-4176-8540-a3fa61cd7002","Type":"ContainerStarted","Data":"1b1413918317b0458ac7ae4fa6ebeeda83d0456707710034ce1c1ef8323b8a75"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.079089 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dpgkt" event={"ID":"f65a9561-f1e7-48f5-ab37-4c59699c0b6f","Type":"ContainerStarted","Data":"ac369753ea746362916d336de82ce1ae8d3617d2416bdbc215e5800810385437"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.080126 4553 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" event={"ID":"0152fc34-26e8-4555-bb44-227eb61394b6","Type":"ContainerStarted","Data":"f581b78a7ed0beb26af5148f386cb2a9f751b730acda12c7d34f9431eac27c8c"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.081407 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" event={"ID":"fd13fb01-b6ec-486e-8b39-7440a349ae64","Type":"ContainerStarted","Data":"5e3836822b04ef9e15914d44b3b0a799b348d3ab8c105f0d091112f33ef30f8a"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.082539 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" event={"ID":"5a5f65f6-0be4-40dd-be78-16de7ada0614","Type":"ContainerStarted","Data":"1802992b315d8d237c154367bbd2ed519fca299b90afffa39e74922f04911cba"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.084162 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" event={"ID":"19cbd0c5-42cd-416e-bfd5-bb5c974bb2fc","Type":"ContainerStarted","Data":"6832792bcc1be1ceb64bdb720c1d141f1487b52d1b33a07faacdecb2d8a68d69"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.085299 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" event={"ID":"e0786ae0-90bb-446a-96ae-5b522f776e0b","Type":"ContainerStarted","Data":"1dbaa7ddb200c88d391780537955d4ab104b15c92a502bfc644af5e850340f08"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.086902 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pncrd" 
event={"ID":"70ef651c-3015-42e3-9d1f-df3833f6343a","Type":"ContainerStarted","Data":"936891daa511e271bf2be2e373c7255cb16655c2c3c00dc39a8045d91e173c8d"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.088226 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" event={"ID":"cf81a909-a6ef-46c2-9d8f-ec12c2c748f4","Type":"ContainerStarted","Data":"3c4806824bbf92486df03ef06e287c908b64b385288884204539d1c7d4a15180"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.089553 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" event={"ID":"f416f93e-65d8-49ae-a035-40bf9857525a","Type":"ContainerStarted","Data":"0e1178bb1443edce0a6c6bb8cb8a881d1f06522077833f1e8e4f8d99f3868e7b"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.091126 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" event={"ID":"323f9188-3789-4e7c-b4d2-17f051188a15","Type":"ContainerStarted","Data":"85b4850e2140125f356bde3ad18a4a4ee5a09282a2f345f8991169b48aacb076"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.095558 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" event={"ID":"94c26543-8110-4758-8860-54fe4f1349aa","Type":"ContainerStarted","Data":"fe9b2add31e225664ef3aab7fd2df00ef1f3ee8b30d33e65c0ce57680d8e07dd"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.096695 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" event={"ID":"38703e61-9d3b-4d8d-aae8-9740c0948ceb","Type":"ContainerStarted","Data":"c2ed7190261109a50f912531cf92b7f50243bf6a782196046a1b6b1a989e1cdd"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.097878 4553 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/dns-default-656jw" event={"ID":"8efd871e-42ac-408d-9a1f-3635cb099a4b","Type":"ContainerStarted","Data":"9414e99c66ba14949883d9d93ee6833bd3164e04e0517b17634029628aa11ad0"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.100805 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9xzz2" event={"ID":"056396e2-e9e7-4bb6-8be4-221c931d490e","Type":"ContainerStarted","Data":"53fd5f870105338316d51ac3d6c3538dbaf77c3453f89c60154036a5c6e5a894"} Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.105235 4553 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6tq2m container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.105282 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" podUID="0e392aad-9ae5-4942-a078-8ef9cbaffb90" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.105334 4553 patch_prober.go:28] interesting pod/console-operator-58897d9998-6bqsb container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.105393 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6bqsb" podUID="06f52f14-b54f-4666-9413-e299c6ad0f22" containerName="console-operator" probeResult="failure" 
output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.105481 4553 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5dq4n container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.105496 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" podUID="96bf498c-034c-431c-ae07-4099724a48a7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.105554 4553 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2chmh container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" start-of-body= Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.105567 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" podUID="8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.105624 4553 patch_prober.go:28] interesting pod/downloads-7954f5f757-j2fv9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.105639 4553 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-j2fv9" podUID="76d4f83a-1f82-4374-bc4d-601f752d318d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.105642 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:47 crc kubenswrapper[4553]: E0930 19:34:47.106251 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:47.606228276 +0000 UTC m=+140.805730406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.124933 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gf6g9" podStartSLOduration=120.124911797 podStartE2EDuration="2m0.124911797s" podCreationTimestamp="2025-09-30 19:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:47.120629283 +0000 UTC m=+140.320131413" watchObservedRunningTime="2025-09-30 19:34:47.124911797 +0000 UTC m=+140.324413927" Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.206328 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:47 crc kubenswrapper[4553]: E0930 19:34:47.206629 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:47.706603945 +0000 UTC m=+140.906106075 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.207300 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:47 crc kubenswrapper[4553]: E0930 19:34:47.209206 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:47.709186204 +0000 UTC m=+140.908688334 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.310770 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:47 crc kubenswrapper[4553]: E0930 19:34:47.311461 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:47.811440753 +0000 UTC m=+141.010942883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.412101 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:47 crc kubenswrapper[4553]: E0930 19:34:47.412750 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:47.912723246 +0000 UTC m=+141.112225376 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.517646 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:47 crc kubenswrapper[4553]: E0930 19:34:47.518316 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.018288504 +0000 UTC m=+141.217790634 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.619791 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:47 crc kubenswrapper[4553]: E0930 19:34:47.620237 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.120224224 +0000 UTC m=+141.319726344 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.721068 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:47 crc kubenswrapper[4553]: E0930 19:34:47.721481 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.221463466 +0000 UTC m=+141.420965596 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.824525 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:47 crc kubenswrapper[4553]: E0930 19:34:47.825052 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.325011769 +0000 UTC m=+141.524513899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.926735 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:47 crc kubenswrapper[4553]: E0930 19:34:47.926926 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.426883188 +0000 UTC m=+141.626385318 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:47 crc kubenswrapper[4553]: I0930 19:34:47.927061 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:47 crc kubenswrapper[4553]: E0930 19:34:47.927471 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.427463063 +0000 UTC m=+141.626965193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.028761 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.028954 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.52891787 +0000 UTC m=+141.728420000 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.029268 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.029686 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.529674071 +0000 UTC m=+141.729176401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.069710 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:48 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:48 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:48 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.069817 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.108450 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b" event={"ID":"59c0a5ab-f3fc-4842-b79e-8d46d0c479b2","Type":"ContainerStarted","Data":"4251733b1345af5447615ec29fa291c76fc8496fe94362209751b9455fd29165"} Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.110904 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" event={"ID":"94c26543-8110-4758-8860-54fe4f1349aa","Type":"ContainerStarted","Data":"ef4d98651d720238ecdd58769ccffb19229f71cf20e140fcf6f2253b0dfbe8a0"} Sep 30 
19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.111132 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.112376 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" event={"ID":"066d34b5-1b98-40a8-98a5-59f7edcc43e4","Type":"ContainerStarted","Data":"c74b4da938d46c457577c5a7519e6a478371cc418141fc8e4ef0772e338e0cf2"} Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.113582 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" event={"ID":"ade6bc35-f568-4e61-b273-dbc590e64141","Type":"ContainerStarted","Data":"44c4dac20f7e2a4092d84ae937cc33a4b90bf10087c2a997565a01a51a9b4359"} Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.113654 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.115633 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" event={"ID":"5f298352-909f-4017-ae2a-ac8deda23167","Type":"ContainerStarted","Data":"5d4ade04168ceb204f352a53a5962419a683bb362be32d726f20fd48615e2695"} Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.115969 4553 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4f82f container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.116023 4553 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" podUID="ade6bc35-f568-4e61-b273-dbc590e64141" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.117583 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbl46" event={"ID":"bf4bad02-1539-46e4-a436-5c4f2c94fda1","Type":"ContainerStarted","Data":"1083585b5d5447e408d14e4665018666e46159e8e11ffee1947523b6508692dd"} Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.117698 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbl46" event={"ID":"bf4bad02-1539-46e4-a436-5c4f2c94fda1","Type":"ContainerStarted","Data":"d6cf2ae789c26e8c9da969570f3aed6b1de3b1341c69b85abd4f1539e5b39bce"} Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.120396 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" event={"ID":"58c5f2ca-b4da-444a-8ecc-22c758896df9","Type":"ContainerStarted","Data":"1f497a3132fc7a7a52c30114d1d14ede61f80408b169328ba67e621625ccef0d"} Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.125298 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-656jw" event={"ID":"8efd871e-42ac-408d-9a1f-3635cb099a4b","Type":"ContainerStarted","Data":"a923266ea03df3e652915920cc84239188899f783dbbe2315d4603aea69bbb59"} Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.125460 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-656jw" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.128078 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w" 
event={"ID":"51d0f112-9b15-4602-b8a6-16e79dfeb4cb","Type":"ContainerStarted","Data":"20f109f93d99b1f8dc817d5e00f15fd7e6cf3b05deaa76d67968c881b7767ca4"} Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.129764 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.129917 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.629888256 +0000 UTC m=+141.829390386 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.130136 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.130508 4553 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.630492041 +0000 UTC m=+141.829994171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.130906 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lsmgs" event={"ID":"9202a914-8e69-4d7e-9337-eff52d5c4bd6","Type":"ContainerStarted","Data":"6a539c24b199b8bda1f496eaba877f43bfec6cf53b86bbb0ce7904e95d24f85e"} Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.132963 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" event={"ID":"d1525a24-eab3-489e-b3c2-2ab74a2c5a60","Type":"ContainerStarted","Data":"f6ba838bbd78f28a44e8e0faf30a785309a0fce3dc7105e8b21d6e11c99f7e05"} Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.137359 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-djhpv" event={"ID":"be6f13fb-81da-4176-8540-a3fa61cd7002","Type":"ContainerStarted","Data":"50d54498b9e98e4944c8d54192fba3ede4f1f2429597dde6e09a2fc8079b1123"} Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.139670 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" 
event={"ID":"ff6a3485-5e32-4bbc-8548-c22eb6b5b6c8","Type":"ContainerStarted","Data":"20028dc9a256946a6e2b9fbce063f938599b3f5846e69c1a3f682433c548b9d7"} Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.141587 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.196621 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-djhpv" podStartSLOduration=120.196573261 podStartE2EDuration="2m0.196573261s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.195736679 +0000 UTC m=+141.395238819" watchObservedRunningTime="2025-09-30 19:34:48.196573261 +0000 UTC m=+141.396075391" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.199678 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qg65b" podStartSLOduration=120.199669794 podStartE2EDuration="2m0.199669794s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.157416392 +0000 UTC m=+141.356918532" watchObservedRunningTime="2025-09-30 19:34:48.199669794 +0000 UTC m=+141.399171924" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.231898 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 
19:34:48.232154 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.732114854 +0000 UTC m=+141.931616974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.232326 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.233446 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.733427598 +0000 UTC m=+141.932929728 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.245851 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" podStartSLOduration=120.24581507 podStartE2EDuration="2m0.24581507s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.240345843 +0000 UTC m=+141.439847973" watchObservedRunningTime="2025-09-30 19:34:48.24581507 +0000 UTC m=+141.445317200" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.284610 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-656jw" podStartSLOduration=9.284594289 podStartE2EDuration="9.284594289s" podCreationTimestamp="2025-09-30 19:34:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.281012783 +0000 UTC m=+141.480514913" watchObservedRunningTime="2025-09-30 19:34:48.284594289 +0000 UTC m=+141.484096419" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.334075 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.334372 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.834331712 +0000 UTC m=+142.033833852 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.335399 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.336795 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.836783387 +0000 UTC m=+142.036285517 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.350922 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" podStartSLOduration=120.350892694 podStartE2EDuration="2m0.350892694s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.335113672 +0000 UTC m=+141.534615812" watchObservedRunningTime="2025-09-30 19:34:48.350892694 +0000 UTC m=+141.550394824" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.389501 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9xzz2" podStartSLOduration=9.389481468 podStartE2EDuration="9.389481468s" podCreationTimestamp="2025-09-30 19:34:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.386434567 +0000 UTC m=+141.585936717" watchObservedRunningTime="2025-09-30 19:34:48.389481468 +0000 UTC m=+141.588983598" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.414406 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-88nf5" podStartSLOduration=120.414385536 podStartE2EDuration="2m0.414385536s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.413006999 +0000 UTC m=+141.612509129" watchObservedRunningTime="2025-09-30 19:34:48.414385536 +0000 UTC m=+141.613887666" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.436819 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.436953 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.936926189 +0000 UTC m=+142.136428319 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.437472 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.437790 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:48.937782122 +0000 UTC m=+142.137284252 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.483462 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bnsph" podStartSLOduration=120.483443965 podStartE2EDuration="2m0.483443965s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.482447718 +0000 UTC m=+141.681949858" watchObservedRunningTime="2025-09-30 19:34:48.483443965 +0000 UTC m=+141.682946095" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.536060 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" podStartSLOduration=120.536026134 podStartE2EDuration="2m0.536026134s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.535862779 +0000 UTC m=+141.735364909" watchObservedRunningTime="2025-09-30 19:34:48.536026134 +0000 UTC m=+141.735528264" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.539187 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.539531 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:49.039512687 +0000 UTC m=+142.239014817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.613940 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4zxpw" podStartSLOduration=120.61392154 podStartE2EDuration="2m0.61392154s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.613080568 +0000 UTC m=+141.812582698" watchObservedRunningTime="2025-09-30 19:34:48.61392154 +0000 UTC m=+141.813423670" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.614699 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" podStartSLOduration=120.614695011 podStartE2EDuration="2m0.614695011s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 
19:34:48.588689254 +0000 UTC m=+141.788191384" watchObservedRunningTime="2025-09-30 19:34:48.614695011 +0000 UTC m=+141.814197141" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.640280 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.640710 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:49.140694157 +0000 UTC m=+142.340196287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.660851 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" podStartSLOduration=120.660831377 podStartE2EDuration="2m0.660831377s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.659224534 +0000 UTC m=+141.858726674" watchObservedRunningTime="2025-09-30 19:34:48.660831377 +0000 UTC 
m=+141.860333497" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.693620 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-dpgkt" podStartSLOduration=120.693591494 podStartE2EDuration="2m0.693591494s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.679365623 +0000 UTC m=+141.878867763" watchObservedRunningTime="2025-09-30 19:34:48.693591494 +0000 UTC m=+141.893093624" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.733650 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qhwcx" podStartSLOduration=120.733634057 podStartE2EDuration="2m0.733634057s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.727665116 +0000 UTC m=+141.927167256" watchObservedRunningTime="2025-09-30 19:34:48.733634057 +0000 UTC m=+141.933136177" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.743738 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.744405 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 19:34:49.244362553 +0000 UTC m=+142.443864683 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.781303 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-254cf" podStartSLOduration=120.781270943 podStartE2EDuration="2m0.781270943s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.7793173 +0000 UTC m=+141.978819430" watchObservedRunningTime="2025-09-30 19:34:48.781270943 +0000 UTC m=+141.980773073" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.845023 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.845480 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 19:34:49.345461922 +0000 UTC m=+142.544964052 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.848675 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-pncrd" podStartSLOduration=120.848660348 podStartE2EDuration="2m0.848660348s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.847214949 +0000 UTC m=+142.046717079" watchObservedRunningTime="2025-09-30 19:34:48.848660348 +0000 UTC m=+142.048162478" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.911961 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-h6f2z" podStartSLOduration=120.911946603 podStartE2EDuration="2m0.911946603s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.911590863 +0000 UTC m=+142.111092993" watchObservedRunningTime="2025-09-30 19:34:48.911946603 +0000 UTC m=+142.111448733" Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.945980 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:48 crc kubenswrapper[4553]: E0930 19:34:48.946457 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:49.446422826 +0000 UTC m=+142.645924976 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:48 crc kubenswrapper[4553]: I0930 19:34:48.956044 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-z7r6d" podStartSLOduration=120.956003323 podStartE2EDuration="2m0.956003323s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.952288963 +0000 UTC m=+142.151791093" watchObservedRunningTime="2025-09-30 19:34:48.956003323 +0000 UTC m=+142.155505453" Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.004192 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-sxdfp" podStartSLOduration=121.004175613 podStartE2EDuration="2m1.004175613s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:48.994661538 +0000 UTC m=+142.194163668" watchObservedRunningTime="2025-09-30 19:34:49.004175613 +0000 UTC m=+142.203677743" Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.046897 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.047227 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:49.547213426 +0000 UTC m=+142.746715556 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.066785 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:49 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:49 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:49 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.067205 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.071649 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jbl46" podStartSLOduration=121.07163695 podStartE2EDuration="2m1.07163695s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:49.070281433 +0000 UTC m=+142.269783563" watchObservedRunningTime="2025-09-30 19:34:49.07163695 +0000 UTC m=+142.271139080" Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.147228 4553 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-4f82f container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.147272 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" podUID="ade6bc35-f568-4e61-b273-dbc590e64141" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.149169 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.149448 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:49.649433654 +0000 UTC m=+142.848935784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.152720 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.154317 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:49.654297884 +0000 UTC m=+142.853800014 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.182948 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bz92f" podStartSLOduration=121.182927271 podStartE2EDuration="2m1.182927271s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:49.115895466 +0000 UTC m=+142.315397596" watchObservedRunningTime="2025-09-30 19:34:49.182927271 +0000 UTC m=+142.382429401" Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.183472 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xz69w" podStartSLOduration=121.183466685 podStartE2EDuration="2m1.183466685s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:49.182111109 +0000 UTC m=+142.381613239" watchObservedRunningTime="2025-09-30 19:34:49.183466685 +0000 UTC m=+142.382968815" Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.215905 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" podStartSLOduration=121.215873573 podStartE2EDuration="2m1.215873573s" podCreationTimestamp="2025-09-30 
19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:49.214928868 +0000 UTC m=+142.414430998" watchObservedRunningTime="2025-09-30 19:34:49.215873573 +0000 UTC m=+142.415375703" Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.255202 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.255621 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:49.755586977 +0000 UTC m=+142.955089107 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.255762 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.257233 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:49.757214881 +0000 UTC m=+142.956717011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.265237 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" podStartSLOduration=121.265216145 podStartE2EDuration="2m1.265216145s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:49.26315151 +0000 UTC m=+142.462653640" watchObservedRunningTime="2025-09-30 19:34:49.265216145 +0000 UTC m=+142.464718275" Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.289288 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lsmgs" podStartSLOduration=10.28926428 podStartE2EDuration="10.28926428s" podCreationTimestamp="2025-09-30 19:34:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:49.287927554 +0000 UTC m=+142.487429694" watchObservedRunningTime="2025-09-30 19:34:49.28926428 +0000 UTC m=+142.488766410" Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.357745 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.357944 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:49.857913638 +0000 UTC m=+143.057415758 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.358021 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.358500 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:49.858483973 +0000 UTC m=+143.057986103 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.459307 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.459475 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:49.959447257 +0000 UTC m=+143.158949387 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.459541 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.459976 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:49.959959671 +0000 UTC m=+143.159461811 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.560507 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.561284 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:50.061266585 +0000 UTC m=+143.260768715 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.662998 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.663418 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:50.16339411 +0000 UTC m=+143.362896240 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.763756 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.764031 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:50.264011715 +0000 UTC m=+143.463513845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.865684 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.866204 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:50.366181082 +0000 UTC m=+143.565683212 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:49 crc kubenswrapper[4553]: I0930 19:34:49.966764 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:49 crc kubenswrapper[4553]: E0930 19:34:49.967068 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:50.467049004 +0000 UTC m=+143.666551134 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.064287 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:50 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:50 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:50 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.064341 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.068309 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.068595 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 19:34:50.568584703 +0000 UTC m=+143.768086833 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.151956 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" event={"ID":"8cb17eb3-73f0-4235-9bf8-11723b544e6e","Type":"ContainerStarted","Data":"07a0978ee08e3fc0d18fde5947de9917970c4caec8e5c80399d10a7736f6c0c5"} Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.172111 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.172304 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:50.67227552 +0000 UTC m=+143.871777650 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.172734 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.173014 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:50.67300261 +0000 UTC m=+143.872504740 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.274073 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.274195 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:50.774161669 +0000 UTC m=+143.973663809 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.274417 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.274746 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:50.774736664 +0000 UTC m=+143.974238784 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.375243 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.375430 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:50.875403671 +0000 UTC m=+144.074905801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.375561 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.375871 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:50.875863284 +0000 UTC m=+144.075365414 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.476301 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.476570 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:50.976553131 +0000 UTC m=+144.176055261 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.577240 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.577654 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.077637548 +0000 UTC m=+144.277139678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.678667 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.678772 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.178754157 +0000 UTC m=+144.378256287 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.678977 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.679385 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.179367603 +0000 UTC m=+144.378869733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.779658 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.779993 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.279956417 +0000 UTC m=+144.479458547 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.881064 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.881429 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.381416305 +0000 UTC m=+144.580918435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.982440 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.982788 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.482752139 +0000 UTC m=+144.682254269 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:50 crc kubenswrapper[4553]: I0930 19:34:50.982993 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:50 crc kubenswrapper[4553]: E0930 19:34:50.983365 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.483351015 +0000 UTC m=+144.682853145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.067753 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:51 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:51 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:51 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.067807 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.084115 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:51 crc kubenswrapper[4553]: E0930 19:34:51.084254 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 19:34:51.584238248 +0000 UTC m=+144.783740378 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.084373 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:51 crc kubenswrapper[4553]: E0930 19:34:51.084682 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.584674899 +0000 UTC m=+144.784177029 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.185648 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:51 crc kubenswrapper[4553]: E0930 19:34:51.185777 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.685743106 +0000 UTC m=+144.885245236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.185967 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:51 crc kubenswrapper[4553]: E0930 19:34:51.186320 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.686311731 +0000 UTC m=+144.885813861 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.287391 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:51 crc kubenswrapper[4553]: E0930 19:34:51.287676 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.787659086 +0000 UTC m=+144.987161216 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.388447 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:51 crc kubenswrapper[4553]: E0930 19:34:51.388837 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.888824706 +0000 UTC m=+145.088326836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.489222 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:51 crc kubenswrapper[4553]: E0930 19:34:51.489423 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.989313558 +0000 UTC m=+145.188815678 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.489525 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:51 crc kubenswrapper[4553]: E0930 19:34:51.489895 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:51.989885663 +0000 UTC m=+145.189387793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.590568 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:51 crc kubenswrapper[4553]: E0930 19:34:51.591076 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:52.091050233 +0000 UTC m=+145.290552373 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.648618 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.648681 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.650193 4553 patch_prober.go:28] interesting pod/console-f9d7485db-6csmn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.650254 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-6csmn" podUID="7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.672742 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-6bqsb" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.691601 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:51 crc kubenswrapper[4553]: E0930 19:34:51.692521 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:52.192472949 +0000 UTC m=+145.391975079 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.712698 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.713120 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.748745 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.786955 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.787125 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.792929 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:51 crc kubenswrapper[4553]: E0930 19:34:51.793088 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:52.293057514 +0000 UTC m=+145.492559644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.793679 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:51 crc kubenswrapper[4553]: E0930 19:34:51.794126 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-09-30 19:34:52.294109932 +0000 UTC m=+145.493612062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.798599 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.818015 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.841994 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.896129 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:51 crc kubenswrapper[4553]: E0930 19:34:51.896878 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:52.396830063 +0000 UTC m=+145.596332193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.969612 4553 patch_prober.go:28] interesting pod/downloads-7954f5f757-j2fv9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.969682 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-j2fv9" podUID="76d4f83a-1f82-4374-bc4d-601f752d318d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.970189 4553 patch_prober.go:28] interesting pod/downloads-7954f5f757-j2fv9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Sep 30 19:34:51 crc kubenswrapper[4553]: I0930 19:34:51.970267 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-j2fv9" podUID="76d4f83a-1f82-4374-bc4d-601f752d318d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:51.998299 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:52 crc kubenswrapper[4553]: E0930 19:34:51.998677 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:52.498661321 +0000 UTC m=+145.698163451 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.003606 4553 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-rddkb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.003653 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" podUID="323f9188-3789-4e7c-b4d2-17f051188a15" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.004931 4553 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-rddkb container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.004996 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" podUID="323f9188-3789-4e7c-b4d2-17f051188a15" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.061085 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.067362 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:52 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:52 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:52 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.067409 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.098628 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:52 crc kubenswrapper[4553]: E0930 19:34:52.099194 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:52.599165752 +0000 UTC m=+145.798667882 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.171315 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" event={"ID":"8cb17eb3-73f0-4235-9bf8-11723b544e6e","Type":"ContainerStarted","Data":"b216724ff0bf7879d646eec29bf539470ffe93faa1f0b73c0f24e804c6d7217a"} Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.181200 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vlpg8" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.200503 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: 
\"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:52 crc kubenswrapper[4553]: E0930 19:34:52.201492 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:52.701481063 +0000 UTC m=+145.900983193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.301266 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:52 crc kubenswrapper[4553]: E0930 19:34:52.301633 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:52.801613075 +0000 UTC m=+146.001115205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.402504 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:52 crc kubenswrapper[4553]: E0930 19:34:52.402833 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:52.902817196 +0000 UTC m=+146.102319326 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.489987 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.491067 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.498609 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.498959 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 30 19:34:52 crc kubenswrapper[4553]: E0930 19:34:52.504140 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:53.00411892 +0000 UTC m=+146.203621050 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.504030 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.504330 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b76da4d2-c7b8-4b2d-9049-09054319874f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b76da4d2-c7b8-4b2d-9049-09054319874f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.504383 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.504435 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b76da4d2-c7b8-4b2d-9049-09054319874f-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"b76da4d2-c7b8-4b2d-9049-09054319874f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 19:34:52 crc kubenswrapper[4553]: E0930 19:34:52.505010 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:53.005001823 +0000 UTC m=+146.204503953 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.527885 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.594718 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ln7lj"] Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.595633 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.602515 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.607396 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.607596 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b76da4d2-c7b8-4b2d-9049-09054319874f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b76da4d2-c7b8-4b2d-9049-09054319874f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.607672 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b76da4d2-c7b8-4b2d-9049-09054319874f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b76da4d2-c7b8-4b2d-9049-09054319874f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 19:34:52 crc kubenswrapper[4553]: E0930 19:34:52.607901 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:53.107885999 +0000 UTC m=+146.307388129 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.608158 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b76da4d2-c7b8-4b2d-9049-09054319874f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b76da4d2-c7b8-4b2d-9049-09054319874f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.681131 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b76da4d2-c7b8-4b2d-9049-09054319874f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b76da4d2-c7b8-4b2d-9049-09054319874f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.691659 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ln7lj"] Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.709682 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.709750 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48844b8a-7077-4916-8e11-d21992f206e0-utilities\") pod \"certified-operators-ln7lj\" (UID: \"48844b8a-7077-4916-8e11-d21992f206e0\") " pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.709768 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hllp2\" (UniqueName: \"kubernetes.io/projected/48844b8a-7077-4916-8e11-d21992f206e0-kube-api-access-hllp2\") pod \"certified-operators-ln7lj\" (UID: \"48844b8a-7077-4916-8e11-d21992f206e0\") " pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.709804 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48844b8a-7077-4916-8e11-d21992f206e0-catalog-content\") pod \"certified-operators-ln7lj\" (UID: \"48844b8a-7077-4916-8e11-d21992f206e0\") " pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:34:52 crc kubenswrapper[4553]: E0930 19:34:52.710146 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:53.210133878 +0000 UTC m=+146.409636008 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.750018 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.771816 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cscrv" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.778919 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.780434 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nh4pv"] Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.781269 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:34:52 crc kubenswrapper[4553]: W0930 19:34:52.787588 4553 reflector.go:561] object-"openshift-marketplace"/"community-operators-dockercfg-dmngl": failed to list *v1.Secret: secrets "community-operators-dockercfg-dmngl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Sep 30 19:34:52 crc kubenswrapper[4553]: E0930 19:34:52.787625 4553 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"community-operators-dockercfg-dmngl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"community-operators-dockercfg-dmngl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.810261 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.810527 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:52 crc kubenswrapper[4553]: E0930 19:34:52.810973 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:53.310953048 +0000 UTC m=+146.510455178 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.811028 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48844b8a-7077-4916-8e11-d21992f206e0-utilities\") pod \"certified-operators-ln7lj\" (UID: \"48844b8a-7077-4916-8e11-d21992f206e0\") " pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.811116 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hllp2\" (UniqueName: \"kubernetes.io/projected/48844b8a-7077-4916-8e11-d21992f206e0-kube-api-access-hllp2\") pod \"certified-operators-ln7lj\" (UID: \"48844b8a-7077-4916-8e11-d21992f206e0\") " pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.811345 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwlq5\" (UniqueName: \"kubernetes.io/projected/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-kube-api-access-xwlq5\") pod \"community-operators-nh4pv\" (UID: \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\") " pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.811424 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-catalog-content\") pod \"community-operators-nh4pv\" (UID: 
\"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\") " pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.811512 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48844b8a-7077-4916-8e11-d21992f206e0-catalog-content\") pod \"certified-operators-ln7lj\" (UID: \"48844b8a-7077-4916-8e11-d21992f206e0\") " pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.811548 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-utilities\") pod \"community-operators-nh4pv\" (UID: \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\") " pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.812236 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:52 crc kubenswrapper[4553]: E0930 19:34:52.812486 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:53.312474949 +0000 UTC m=+146.511977079 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.813305 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48844b8a-7077-4916-8e11-d21992f206e0-utilities\") pod \"certified-operators-ln7lj\" (UID: \"48844b8a-7077-4916-8e11-d21992f206e0\") " pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.813802 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48844b8a-7077-4916-8e11-d21992f206e0-catalog-content\") pod \"certified-operators-ln7lj\" (UID: \"48844b8a-7077-4916-8e11-d21992f206e0\") " pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.875217 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nh4pv"] Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.877767 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hllp2\" (UniqueName: \"kubernetes.io/projected/48844b8a-7077-4916-8e11-d21992f206e0-kube-api-access-hllp2\") pod \"certified-operators-ln7lj\" (UID: \"48844b8a-7077-4916-8e11-d21992f206e0\") " pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.913409 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.913598 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwlq5\" (UniqueName: \"kubernetes.io/projected/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-kube-api-access-xwlq5\") pod \"community-operators-nh4pv\" (UID: \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\") " pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.913625 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-catalog-content\") pod \"community-operators-nh4pv\" (UID: \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\") " pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.913647 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-utilities\") pod \"community-operators-nh4pv\" (UID: \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\") " pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.914137 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-utilities\") pod \"community-operators-nh4pv\" (UID: \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\") " pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:34:52 crc kubenswrapper[4553]: E0930 19:34:52.914213 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:53.414196394 +0000 UTC m=+146.613698514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.914707 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-catalog-content\") pod \"community-operators-nh4pv\" (UID: \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\") " pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.918310 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:34:52 crc kubenswrapper[4553]: I0930 19:34:52.978203 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwlq5\" (UniqueName: \"kubernetes.io/projected/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-kube-api-access-xwlq5\") pod \"community-operators-nh4pv\" (UID: \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\") " pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.002589 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vprtz"] Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.003545 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.016434 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:53 crc kubenswrapper[4553]: E0930 19:34:53.016802 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:53.516786531 +0000 UTC m=+146.716288661 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.065226 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.070390 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:53 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:53 crc kubenswrapper[4553]: [+]process-running ok 
Sep 30 19:34:53 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.070433 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.128621 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.128838 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d78589-abdf-4d0e-a6d2-6649c506a9aa-catalog-content\") pod \"certified-operators-vprtz\" (UID: \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\") " pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.128902 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d78589-abdf-4d0e-a6d2-6649c506a9aa-utilities\") pod \"certified-operators-vprtz\" (UID: \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\") " pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.128963 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwzjb\" (UniqueName: \"kubernetes.io/projected/75d78589-abdf-4d0e-a6d2-6649c506a9aa-kube-api-access-jwzjb\") pod \"certified-operators-vprtz\" (UID: \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\") " 
pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:34:53 crc kubenswrapper[4553]: E0930 19:34:53.129154 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:53.62913963 +0000 UTC m=+146.828641760 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.141922 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4f82f" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.168470 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.192560 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r2vvz"] Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.198166 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.212019 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" event={"ID":"8cb17eb3-73f0-4235-9bf8-11723b544e6e","Type":"ContainerStarted","Data":"3a6e97c0cfb7dc4e9b9da387f0a152e65910e8a551b455bf51f957ad9fdd2f2b"} Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.212103 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" event={"ID":"8cb17eb3-73f0-4235-9bf8-11723b544e6e","Type":"ContainerStarted","Data":"5335d7e8dd9d716028ba67b0dbae4a43da60621eb06aafbb6f212b3db48432cc"} Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.217070 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vprtz"] Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.231531 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.231578 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d78589-abdf-4d0e-a6d2-6649c506a9aa-utilities\") pod \"certified-operators-vprtz\" (UID: \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\") " pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.232566 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d78589-abdf-4d0e-a6d2-6649c506a9aa-utilities\") pod 
\"certified-operators-vprtz\" (UID: \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\") " pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.232837 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwzjb\" (UniqueName: \"kubernetes.io/projected/75d78589-abdf-4d0e-a6d2-6649c506a9aa-kube-api-access-jwzjb\") pod \"certified-operators-vprtz\" (UID: \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\") " pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:34:53 crc kubenswrapper[4553]: E0930 19:34:53.233099 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:53.733082524 +0000 UTC m=+146.932584754 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.233691 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d78589-abdf-4d0e-a6d2-6649c506a9aa-catalog-content\") pod \"certified-operators-vprtz\" (UID: \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\") " pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.235179 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/75d78589-abdf-4d0e-a6d2-6649c506a9aa-catalog-content\") pod \"certified-operators-vprtz\" (UID: \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\") " pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.313492 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r2vvz"] Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.334496 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.334647 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be1f54e-bb1a-4ef0-90b8-865875aa543e-catalog-content\") pod \"community-operators-r2vvz\" (UID: \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\") " pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.334729 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnkpp\" (UniqueName: \"kubernetes.io/projected/3be1f54e-bb1a-4ef0-90b8-865875aa543e-kube-api-access-vnkpp\") pod \"community-operators-r2vvz\" (UID: \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\") " pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.334811 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be1f54e-bb1a-4ef0-90b8-865875aa543e-utilities\") pod \"community-operators-r2vvz\" (UID: \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\") " 
pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:34:53 crc kubenswrapper[4553]: E0930 19:34:53.335113 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:53.835095337 +0000 UTC m=+147.034597467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.355451 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwzjb\" (UniqueName: \"kubernetes.io/projected/75d78589-abdf-4d0e-a6d2-6649c506a9aa-kube-api-access-jwzjb\") pod \"certified-operators-vprtz\" (UID: \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\") " pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.389309 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8f88f" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.436302 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnkpp\" (UniqueName: \"kubernetes.io/projected/3be1f54e-bb1a-4ef0-90b8-865875aa543e-kube-api-access-vnkpp\") pod \"community-operators-r2vvz\" (UID: \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\") " pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.436362 4553 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.436395 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be1f54e-bb1a-4ef0-90b8-865875aa543e-utilities\") pod \"community-operators-r2vvz\" (UID: \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\") " pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.436447 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be1f54e-bb1a-4ef0-90b8-865875aa543e-catalog-content\") pod \"community-operators-r2vvz\" (UID: \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\") " pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.436923 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be1f54e-bb1a-4ef0-90b8-865875aa543e-catalog-content\") pod \"community-operators-r2vvz\" (UID: \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\") " pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:34:53 crc kubenswrapper[4553]: E0930 19:34:53.437760 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:53.937750416 +0000 UTC m=+147.137252546 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.437994 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be1f54e-bb1a-4ef0-90b8-865875aa543e-utilities\") pod \"community-operators-r2vvz\" (UID: \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\") " pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.462018 4553 patch_prober.go:28] interesting pod/apiserver-76f77b778f-djhpv container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Sep 30 19:34:53 crc kubenswrapper[4553]: [+]log ok Sep 30 19:34:53 crc kubenswrapper[4553]: [+]etcd ok Sep 30 19:34:53 crc kubenswrapper[4553]: [+]poststarthook/start-apiserver-admission-initializer ok Sep 30 19:34:53 crc kubenswrapper[4553]: [+]poststarthook/generic-apiserver-start-informers ok Sep 30 19:34:53 crc kubenswrapper[4553]: [+]poststarthook/max-in-flight-filter ok Sep 30 19:34:53 crc kubenswrapper[4553]: [+]poststarthook/storage-object-count-tracker-hook ok Sep 30 19:34:53 crc kubenswrapper[4553]: [+]poststarthook/image.openshift.io-apiserver-caches ok Sep 30 19:34:53 crc kubenswrapper[4553]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Sep 30 19:34:53 crc kubenswrapper[4553]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Sep 30 19:34:53 crc 
kubenswrapper[4553]: [+]poststarthook/project.openshift.io-projectcache ok Sep 30 19:34:53 crc kubenswrapper[4553]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Sep 30 19:34:53 crc kubenswrapper[4553]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Sep 30 19:34:53 crc kubenswrapper[4553]: [+]poststarthook/openshift.io-restmapperupdater ok Sep 30 19:34:53 crc kubenswrapper[4553]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Sep 30 19:34:53 crc kubenswrapper[4553]: livez check failed Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.462092 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-djhpv" podUID="be6f13fb-81da-4176-8540-a3fa61cd7002" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.504382 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnkpp\" (UniqueName: \"kubernetes.io/projected/3be1f54e-bb1a-4ef0-90b8-865875aa543e-kube-api-access-vnkpp\") pod \"community-operators-r2vvz\" (UID: \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\") " pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.508698 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-vm8xl" podStartSLOduration=14.508678006 podStartE2EDuration="14.508678006s" podCreationTimestamp="2025-09-30 19:34:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:53.438358523 +0000 UTC m=+146.637860643" watchObservedRunningTime="2025-09-30 19:34:53.508678006 +0000 UTC m=+146.708180126" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.538787 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:53 crc kubenswrapper[4553]: E0930 19:34:53.544376 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:54.044354452 +0000 UTC m=+147.243856582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.636394 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.639909 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:53 crc kubenswrapper[4553]: E0930 19:34:53.640323 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 19:34:54.140307222 +0000 UTC m=+147.339809362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.688011 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.741066 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:53 crc kubenswrapper[4553]: E0930 19:34:53.741456 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:54.24143619 +0000 UTC m=+147.440938320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.842300 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:53 crc kubenswrapper[4553]: E0930 19:34:53.843058 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:54.343024822 +0000 UTC m=+147.542526952 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.899264 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ln7lj"] Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.943517 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:53 crc kubenswrapper[4553]: E0930 19:34:53.943935 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:54.443919404 +0000 UTC m=+147.643421534 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.961585 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.963719 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:34:53 crc kubenswrapper[4553]: I0930 19:34:53.967474 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.044959 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.045138 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddkb" Sep 30 19:34:54 crc kubenswrapper[4553]: E0930 19:34:54.045392 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 19:34:54.545378592 +0000 UTC m=+147.744880722 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.065475 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:54 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:54 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:54 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.065517 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.155648 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:54 crc kubenswrapper[4553]: E0930 19:34:54.156194 4553 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:54.65616379 +0000 UTC m=+147.855665920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.237227 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b76da4d2-c7b8-4b2d-9049-09054319874f","Type":"ContainerStarted","Data":"3c75b8ddcbc0ea075f2d206bdfd3474a4e12b869bc386783ea2423edcff1a3e8"} Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.244469 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ln7lj" event={"ID":"48844b8a-7077-4916-8e11-d21992f206e0","Type":"ContainerStarted","Data":"1e61dc84ed2c70b17f0b5c17f417a512180ab8ea35c4748312ed04f1c5b60a8a"} Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.257327 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:54 crc kubenswrapper[4553]: E0930 19:34:54.257674 4553 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:54.757662458 +0000 UTC m=+147.957164588 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.361679 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:54 crc kubenswrapper[4553]: E0930 19:34:54.362146 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:54.862130236 +0000 UTC m=+148.061632366 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.362590 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:54 crc kubenswrapper[4553]: E0930 19:34:54.363414 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:54.863402471 +0000 UTC m=+148.062904601 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.439616 4553 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.463739 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:54 crc kubenswrapper[4553]: E0930 19:34:54.463928 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:54.963902612 +0000 UTC m=+148.163404742 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.464128 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:54 crc kubenswrapper[4553]: E0930 19:34:54.464454 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:54.964446447 +0000 UTC m=+148.163948577 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:54 crc kubenswrapper[4553]: W0930 19:34:54.544873 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75d78589_abdf_4d0e_a6d2_6649c506a9aa.slice/crio-46a56a13efe8f671cc4e7af3ba4cb0c18709a59c6047078950738eaef0a66969 WatchSource:0}: Error finding container 46a56a13efe8f671cc4e7af3ba4cb0c18709a59c6047078950738eaef0a66969: Status 404 returned error can't find the container with id 46a56a13efe8f671cc4e7af3ba4cb0c18709a59c6047078950738eaef0a66969 Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.559516 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vprtz"] Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.566411 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:54 crc kubenswrapper[4553]: E0930 19:34:54.566726 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 19:34:55.066708246 +0000 UTC m=+148.266210376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.661952 4553 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-30T19:34:54.439652753Z","Handler":null,"Name":""} Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.667857 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:54 crc kubenswrapper[4553]: E0930 19:34:54.669508 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 19:34:55.169497479 +0000 UTC m=+148.368999609 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6bpj6" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.713768 4553 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.713807 4553 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.773627 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.774726 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r9x8c"] Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.776528 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.781744 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.788097 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.794244 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.794988 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.795161 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.805756 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.817617 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.836176 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9x8c"] Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.876529 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.900984 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nh4pv"] Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.978677 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-catalog-content\") pod \"redhat-marketplace-r9x8c\" (UID: \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\") " pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.978719 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-utilities\") pod \"redhat-marketplace-r9x8c\" (UID: \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\") " pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.978739 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/222f71b8-90ec-44a7-93a8-2a53e30e8560-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"222f71b8-90ec-44a7-93a8-2a53e30e8560\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.978765 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw255\" (UniqueName: \"kubernetes.io/projected/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-kube-api-access-fw255\") pod \"redhat-marketplace-r9x8c\" (UID: \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\") " pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.978807 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/222f71b8-90ec-44a7-93a8-2a53e30e8560-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"222f71b8-90ec-44a7-93a8-2a53e30e8560\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.986699 4553 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 19:34:54 crc kubenswrapper[4553]: I0930 19:34:54.986753 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.040468 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r2vvz"] Sep 30 19:34:55 crc kubenswrapper[4553]: W0930 19:34:55.057266 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3be1f54e_bb1a_4ef0_90b8_865875aa543e.slice/crio-8e282ab62e68fed10c9967674d0f0267f3439bc5760c65c7b55c473e675f44f2 WatchSource:0}: Error finding container 8e282ab62e68fed10c9967674d0f0267f3439bc5760c65c7b55c473e675f44f2: Status 404 returned error can't find the container with id 8e282ab62e68fed10c9967674d0f0267f3439bc5760c65c7b55c473e675f44f2 Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.079472 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:55 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:55 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:55 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.079535 4553 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.080861 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-catalog-content\") pod \"redhat-marketplace-r9x8c\" (UID: \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\") " pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.080917 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-utilities\") pod \"redhat-marketplace-r9x8c\" (UID: \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\") " pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.080939 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/222f71b8-90ec-44a7-93a8-2a53e30e8560-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"222f71b8-90ec-44a7-93a8-2a53e30e8560\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.080973 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw255\" (UniqueName: \"kubernetes.io/projected/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-kube-api-access-fw255\") pod \"redhat-marketplace-r9x8c\" (UID: \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\") " pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.081024 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/222f71b8-90ec-44a7-93a8-2a53e30e8560-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"222f71b8-90ec-44a7-93a8-2a53e30e8560\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.081174 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/222f71b8-90ec-44a7-93a8-2a53e30e8560-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"222f71b8-90ec-44a7-93a8-2a53e30e8560\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.081666 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-catalog-content\") pod \"redhat-marketplace-r9x8c\" (UID: \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\") " pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.081899 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-utilities\") pod \"redhat-marketplace-r9x8c\" (UID: \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\") " pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.108418 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/222f71b8-90ec-44a7-93a8-2a53e30e8560-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"222f71b8-90ec-44a7-93a8-2a53e30e8560\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.135001 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw255\" (UniqueName: \"kubernetes.io/projected/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-kube-api-access-fw255\") pod 
\"redhat-marketplace-r9x8c\" (UID: \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\") " pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.151355 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.172048 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4btj5"] Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.173492 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4btj5" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.240108 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4btj5"] Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.247091 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6bpj6\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.319131 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-catalog-content\") pod \"redhat-marketplace-4btj5\" (UID: \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\") " pod="openshift-marketplace/redhat-marketplace-4btj5" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.319302 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-utilities\") pod 
\"redhat-marketplace-4btj5\" (UID: \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\") " pod="openshift-marketplace/redhat-marketplace-4btj5" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.319480 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.319568 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.319680 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn6rc\" (UniqueName: \"kubernetes.io/projected/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-kube-api-access-jn6rc\") pod \"redhat-marketplace-4btj5\" (UID: \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\") " pod="openshift-marketplace/redhat-marketplace-4btj5" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.327159 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.330909 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.339351 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b76da4d2-c7b8-4b2d-9049-09054319874f","Type":"ContainerStarted","Data":"0b5f5dbdca029d3d27c3a6918c61844594496e72900459ea8798310565236483"} Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.376237 4553 generic.go:334] "Generic (PLEG): container finished" podID="48844b8a-7077-4916-8e11-d21992f206e0" containerID="cf463790b1a7cc2f25ec9f48ad3261f42412c4201931bfe2df819d9378308533" exitCode=0 Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.376316 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ln7lj" event={"ID":"48844b8a-7077-4916-8e11-d21992f206e0","Type":"ContainerDied","Data":"cf463790b1a7cc2f25ec9f48ad3261f42412c4201931bfe2df819d9378308533"} Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.391419 4553 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.395846 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.413767 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh4pv" event={"ID":"2b3d8fa7-639e-46a6-8555-e5930dcc81c9","Type":"ContainerStarted","Data":"cb7351d1a11995720a3ee0efb39c61f1fa87dab316074ca3ed8b30132e31e4b1"} Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.413821 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh4pv" event={"ID":"2b3d8fa7-639e-46a6-8555-e5930dcc81c9","Type":"ContainerStarted","Data":"e0a2c5ac5f3abd93618510462a7d8026fb1b778ec88450ad88b5232ba4bac226"} Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.424306 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.424295956 podStartE2EDuration="3.424295956s" podCreationTimestamp="2025-09-30 19:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:55.38147185 +0000 UTC m=+148.580973990" watchObservedRunningTime="2025-09-30 19:34:55.424295956 +0000 UTC m=+148.623798086" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.424461 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn6rc\" (UniqueName: \"kubernetes.io/projected/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-kube-api-access-jn6rc\") pod \"redhat-marketplace-4btj5\" (UID: \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\") " pod="openshift-marketplace/redhat-marketplace-4btj5" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.424547 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.424579 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-catalog-content\") pod \"redhat-marketplace-4btj5\" (UID: \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\") " pod="openshift-marketplace/redhat-marketplace-4btj5" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.424646 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-utilities\") pod \"redhat-marketplace-4btj5\" (UID: \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\") " pod="openshift-marketplace/redhat-marketplace-4btj5" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.424685 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.427767 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-catalog-content\") pod \"redhat-marketplace-4btj5\" (UID: \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\") " pod="openshift-marketplace/redhat-marketplace-4btj5" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.428004 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-utilities\") pod 
\"redhat-marketplace-4btj5\" (UID: \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\") " pod="openshift-marketplace/redhat-marketplace-4btj5" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.435351 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.436378 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.439014 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2vvz" event={"ID":"3be1f54e-bb1a-4ef0-90b8-865875aa543e","Type":"ContainerStarted","Data":"8e282ab62e68fed10c9967674d0f0267f3439bc5760c65c7b55c473e675f44f2"} Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.446304 4553 generic.go:334] "Generic (PLEG): container finished" podID="75d78589-abdf-4d0e-a6d2-6649c506a9aa" containerID="7691965aa1c228b31a8a482237536cc993e2ea7ca10564afce2a96b3a5513f2b" exitCode=0 Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.446345 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vprtz" event={"ID":"75d78589-abdf-4d0e-a6d2-6649c506a9aa","Type":"ContainerDied","Data":"7691965aa1c228b31a8a482237536cc993e2ea7ca10564afce2a96b3a5513f2b"} Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.446373 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-vprtz" event={"ID":"75d78589-abdf-4d0e-a6d2-6649c506a9aa","Type":"ContainerStarted","Data":"46a56a13efe8f671cc4e7af3ba4cb0c18709a59c6047078950738eaef0a66969"} Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.464292 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.467148 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn6rc\" (UniqueName: \"kubernetes.io/projected/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-kube-api-access-jn6rc\") pod \"redhat-marketplace-4btj5\" (UID: \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\") " pod="openshift-marketplace/redhat-marketplace-4btj5" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.472531 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.512702 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.531539 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4btj5" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.579513 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.631839 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.641599 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/584c5bac-180e-46de-8e53-6586f27f2cea-metrics-certs\") pod \"network-metrics-daemon-swqk9\" (UID: \"584c5bac-180e-46de-8e53-6586f27f2cea\") " pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.732780 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.802222 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bm42p"] Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.803392 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.806595 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.842139 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bm42p"] Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.873569 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.930291 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swqk9" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.935626 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxdb7\" (UniqueName: \"kubernetes.io/projected/3419189e-5bb6-44e2-a087-79f44da3bb41-kube-api-access-vxdb7\") pod \"redhat-operators-bm42p\" (UID: \"3419189e-5bb6-44e2-a087-79f44da3bb41\") " pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.935682 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3419189e-5bb6-44e2-a087-79f44da3bb41-catalog-content\") pod \"redhat-operators-bm42p\" (UID: \"3419189e-5bb6-44e2-a087-79f44da3bb41\") " pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:34:55 crc kubenswrapper[4553]: I0930 19:34:55.935714 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3419189e-5bb6-44e2-a087-79f44da3bb41-utilities\") pod \"redhat-operators-bm42p\" (UID: \"3419189e-5bb6-44e2-a087-79f44da3bb41\") " 
pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.040983 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3419189e-5bb6-44e2-a087-79f44da3bb41-catalog-content\") pod \"redhat-operators-bm42p\" (UID: \"3419189e-5bb6-44e2-a087-79f44da3bb41\") " pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.041144 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3419189e-5bb6-44e2-a087-79f44da3bb41-utilities\") pod \"redhat-operators-bm42p\" (UID: \"3419189e-5bb6-44e2-a087-79f44da3bb41\") " pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.041212 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxdb7\" (UniqueName: \"kubernetes.io/projected/3419189e-5bb6-44e2-a087-79f44da3bb41-kube-api-access-vxdb7\") pod \"redhat-operators-bm42p\" (UID: \"3419189e-5bb6-44e2-a087-79f44da3bb41\") " pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.041824 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3419189e-5bb6-44e2-a087-79f44da3bb41-catalog-content\") pod \"redhat-operators-bm42p\" (UID: \"3419189e-5bb6-44e2-a087-79f44da3bb41\") " pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.042156 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3419189e-5bb6-44e2-a087-79f44da3bb41-utilities\") pod \"redhat-operators-bm42p\" (UID: \"3419189e-5bb6-44e2-a087-79f44da3bb41\") " pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:34:56 crc 
kubenswrapper[4553]: I0930 19:34:56.071348 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:56 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:56 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:56 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.071410 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.081228 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxdb7\" (UniqueName: \"kubernetes.io/projected/3419189e-5bb6-44e2-a087-79f44da3bb41-kube-api-access-vxdb7\") pod \"redhat-operators-bm42p\" (UID: \"3419189e-5bb6-44e2-a087-79f44da3bb41\") " pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.169175 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q4c98"] Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.170583 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.180221 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.199543 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q4c98"] Sep 30 19:34:56 crc kubenswrapper[4553]: W0930 19:34:56.328374 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-9314b64bc4123f2805f1bf37f8b54d154a42d7cf6100344da571a4869ef41208 WatchSource:0}: Error finding container 9314b64bc4123f2805f1bf37f8b54d154a42d7cf6100344da571a4869ef41208: Status 404 returned error can't find the container with id 9314b64bc4123f2805f1bf37f8b54d154a42d7cf6100344da571a4869ef41208 Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.340388 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6bpj6"] Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.346333 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtv7k\" (UniqueName: \"kubernetes.io/projected/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-kube-api-access-gtv7k\") pod \"redhat-operators-q4c98\" (UID: \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\") " pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.346382 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-utilities\") pod \"redhat-operators-q4c98\" (UID: \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\") " pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.346421 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-catalog-content\") pod \"redhat-operators-q4c98\" (UID: \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\") " pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.449865 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtv7k\" (UniqueName: \"kubernetes.io/projected/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-kube-api-access-gtv7k\") pod \"redhat-operators-q4c98\" (UID: \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\") " pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.450210 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-utilities\") pod \"redhat-operators-q4c98\" (UID: \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\") " pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.463282 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-catalog-content\") pod \"redhat-operators-q4c98\" (UID: \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\") " pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.450965 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-utilities\") pod \"redhat-operators-q4c98\" (UID: \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\") " pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.463833 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-catalog-content\") pod \"redhat-operators-q4c98\" (UID: \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\") " pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.467097 4553 generic.go:334] "Generic (PLEG): container finished" podID="b76da4d2-c7b8-4b2d-9049-09054319874f" containerID="0b5f5dbdca029d3d27c3a6918c61844594496e72900459ea8798310565236483" exitCode=0 Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.467164 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b76da4d2-c7b8-4b2d-9049-09054319874f","Type":"ContainerDied","Data":"0b5f5dbdca029d3d27c3a6918c61844594496e72900459ea8798310565236483"} Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.471280 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtv7k\" (UniqueName: \"kubernetes.io/projected/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-kube-api-access-gtv7k\") pod \"redhat-operators-q4c98\" (UID: \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\") " pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.490731 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9314b64bc4123f2805f1bf37f8b54d154a42d7cf6100344da571a4869ef41208"} Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.505475 4553 generic.go:334] "Generic (PLEG): container finished" podID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" containerID="cb7351d1a11995720a3ee0efb39c61f1fa87dab316074ca3ed8b30132e31e4b1" exitCode=0 Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.505543 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh4pv" 
event={"ID":"2b3d8fa7-639e-46a6-8555-e5930dcc81c9","Type":"ContainerDied","Data":"cb7351d1a11995720a3ee0efb39c61f1fa87dab316074ca3ed8b30132e31e4b1"} Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.519772 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" event={"ID":"50e7e6b4-78bd-4209-bf3e-7c27662763fd","Type":"ContainerStarted","Data":"f15e7afc340fac1dc0a1b01657a4951ece1d7588a18a7bcef2839b4a0ef3e5b1"} Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.520551 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.524131 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"222f71b8-90ec-44a7-93a8-2a53e30e8560","Type":"ContainerStarted","Data":"4025168f73811af96da552f38eaeda704dd659a3fcd97d58c2f91b0c8f237de6"} Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.559511 4553 generic.go:334] "Generic (PLEG): container finished" podID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" containerID="0d2d437f950ff1b8983bdb71311673fee6171e363c36c2ddae8eef3c9bb3faec" exitCode=0 Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.559598 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2vvz" event={"ID":"3be1f54e-bb1a-4ef0-90b8-865875aa543e","Type":"ContainerDied","Data":"0d2d437f950ff1b8983bdb71311673fee6171e363c36c2ddae8eef3c9bb3faec"} Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.693832 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9x8c"] Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.699921 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4btj5"] Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.717710 4553 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:56 crc kubenswrapper[4553]: W0930 19:34:56.723941 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7d8b1f0_6dc8_4242_b2db_709ed240e30d.slice/crio-f3d259504a7ee92272338b5a4c8dcfbb8cb385fb1b3cea565f9d660f03c53e5d WatchSource:0}: Error finding container f3d259504a7ee92272338b5a4c8dcfbb8cb385fb1b3cea565f9d660f03c53e5d: Status 404 returned error can't find the container with id f3d259504a7ee92272338b5a4c8dcfbb8cb385fb1b3cea565f9d660f03c53e5d Sep 30 19:34:56 crc kubenswrapper[4553]: W0930 19:34:56.752775 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fc597f6_5e78_4ded_b2e6_e4fb5fdf7a99.slice/crio-5fd086f050268b53667629bbfca5b4fc5cc2928075f77ca380d525b3ca00a94a WatchSource:0}: Error finding container 5fd086f050268b53667629bbfca5b4fc5cc2928075f77ca380d525b3ca00a94a: Status 404 returned error can't find the container with id 5fd086f050268b53667629bbfca5b4fc5cc2928075f77ca380d525b3ca00a94a Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.760308 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-djhpv" Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.814393 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-swqk9"] Sep 30 19:34:56 crc kubenswrapper[4553]: I0930 19:34:56.987012 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bm42p"] Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.074114 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:57 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:57 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:57 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.074174 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.448566 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q4c98"] Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.583710 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" event={"ID":"50e7e6b4-78bd-4209-bf3e-7c27662763fd","Type":"ContainerStarted","Data":"4ac1943685a1a234e02b0cf8b3786862c5328164394d3ae8ff52cb447a9dd616"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.584647 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.592879 4553 generic.go:334] "Generic (PLEG): container finished" podID="222f71b8-90ec-44a7-93a8-2a53e30e8560" containerID="c4c9adb87b6455baa919301c295b0cc54d61e724836767c861f7332739ca6190" exitCode=0 Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.593081 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"222f71b8-90ec-44a7-93a8-2a53e30e8560","Type":"ContainerDied","Data":"c4c9adb87b6455baa919301c295b0cc54d61e724836767c861f7332739ca6190"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.616953 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"aac7cfb72bbfe29af3adc4be4e3b3571e9d9d4f1a8c1c857f064270bd9fd03a9"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.617004 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"752c33a518d2dca8296c65f3fefb7484e5fdf4cf1a381d1e6b4f6a483f9ab8e9"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.617841 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.627991 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4c98" event={"ID":"99d1afa5-f3ac-46ec-9f99-d0fab6b82935","Type":"ContainerStarted","Data":"5fd009f2cebbcf50bdc4ae32344997351724dd9a3c6d378ebba02f9b7420a5d4"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.638416 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-swqk9" event={"ID":"584c5bac-180e-46de-8e53-6586f27f2cea","Type":"ContainerStarted","Data":"854d6460f943f31e2dc85a4f165b2fc8421609f2226daabfa81d72e93c00822f"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.661292 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm42p" event={"ID":"3419189e-5bb6-44e2-a087-79f44da3bb41","Type":"ContainerStarted","Data":"416b265dbf921c68fdf43940cad2442d6b7007691a91852563ac296dc3fca2d2"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.686543 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9c7c29ec03609943be0e14e479464e9d74971756c29f40a245b001d310beefff"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.686875 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8c6a0358ae58a3189d5849a530f4daf9706b82aaa61d3c8396eee89606064529"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.762145 4553 generic.go:334] "Generic (PLEG): container finished" podID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" containerID="94c43452172f59092392b21d96ae0aa26f5399fa72f56feb9342b1192a5a5135" exitCode=0 Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.762303 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4btj5" event={"ID":"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99","Type":"ContainerDied","Data":"94c43452172f59092392b21d96ae0aa26f5399fa72f56feb9342b1192a5a5135"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.762333 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4btj5" event={"ID":"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99","Type":"ContainerStarted","Data":"5fd086f050268b53667629bbfca5b4fc5cc2928075f77ca380d525b3ca00a94a"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.787468 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"821d0e0949f0200f04327a0614966fe7b73b205b560601939c52e524dd6fa924"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.837307 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-656jw" Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.854465 4553 generic.go:334] 
"Generic (PLEG): container finished" podID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" containerID="864cf4ea5dd9d7ade82f4e7da09dd03f2c1fe20a9292eb61bd00f4adbf3fa142" exitCode=0 Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.857865 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9x8c" event={"ID":"f7d8b1f0-6dc8-4242-b2db-709ed240e30d","Type":"ContainerDied","Data":"864cf4ea5dd9d7ade82f4e7da09dd03f2c1fe20a9292eb61bd00f4adbf3fa142"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.857912 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9x8c" event={"ID":"f7d8b1f0-6dc8-4242-b2db-709ed240e30d","Type":"ContainerStarted","Data":"f3d259504a7ee92272338b5a4c8dcfbb8cb385fb1b3cea565f9d660f03c53e5d"} Sep 30 19:34:57 crc kubenswrapper[4553]: I0930 19:34:57.860970 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" podStartSLOduration=129.860949783 podStartE2EDuration="2m9.860949783s" podCreationTimestamp="2025-09-30 19:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:57.857653324 +0000 UTC m=+151.057155454" watchObservedRunningTime="2025-09-30 19:34:57.860949783 +0000 UTC m=+151.060451913" Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.068801 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:58 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:58 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:58 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.069238 4553 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.476731 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.582989 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b76da4d2-c7b8-4b2d-9049-09054319874f-kubelet-dir\") pod \"b76da4d2-c7b8-4b2d-9049-09054319874f\" (UID: \"b76da4d2-c7b8-4b2d-9049-09054319874f\") " Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.583085 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b76da4d2-c7b8-4b2d-9049-09054319874f-kube-api-access\") pod \"b76da4d2-c7b8-4b2d-9049-09054319874f\" (UID: \"b76da4d2-c7b8-4b2d-9049-09054319874f\") " Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.583274 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b76da4d2-c7b8-4b2d-9049-09054319874f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b76da4d2-c7b8-4b2d-9049-09054319874f" (UID: "b76da4d2-c7b8-4b2d-9049-09054319874f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.583567 4553 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b76da4d2-c7b8-4b2d-9049-09054319874f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.611216 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76da4d2-c7b8-4b2d-9049-09054319874f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b76da4d2-c7b8-4b2d-9049-09054319874f" (UID: "b76da4d2-c7b8-4b2d-9049-09054319874f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.690719 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b76da4d2-c7b8-4b2d-9049-09054319874f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.892736 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b76da4d2-c7b8-4b2d-9049-09054319874f","Type":"ContainerDied","Data":"3c75b8ddcbc0ea075f2d206bdfd3474a4e12b869bc386783ea2423edcff1a3e8"} Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.892780 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c75b8ddcbc0ea075f2d206bdfd3474a4e12b869bc386783ea2423edcff1a3e8" Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.892834 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.921742 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-swqk9" event={"ID":"584c5bac-180e-46de-8e53-6586f27f2cea","Type":"ContainerStarted","Data":"827d977c9b2ba80e0b0a819b4f313b16fc105f4250616c6211ba9e75e8559bfe"} Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.921793 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-swqk9" event={"ID":"584c5bac-180e-46de-8e53-6586f27f2cea","Type":"ContainerStarted","Data":"b47c3c3a5f2515fec112f703d9f2cb56993d82ae11078f2fb7569f9fe88d99fc"} Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.924799 4553 generic.go:334] "Generic (PLEG): container finished" podID="3419189e-5bb6-44e2-a087-79f44da3bb41" containerID="3b24b4d163bb4777e4a05256c34cabdc2ecf2f1b6aa29440feda9ca226eae702" exitCode=0 Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.924848 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm42p" event={"ID":"3419189e-5bb6-44e2-a087-79f44da3bb41","Type":"ContainerDied","Data":"3b24b4d163bb4777e4a05256c34cabdc2ecf2f1b6aa29440feda9ca226eae702"} Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.930217 4553 generic.go:334] "Generic (PLEG): container finished" podID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" containerID="f469bcfdfed07a10ca27f4d70d81fd8032c797ccc68f4422fc5c0bcd3fabfda9" exitCode=0 Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.932681 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4c98" event={"ID":"99d1afa5-f3ac-46ec-9f99-d0fab6b82935","Type":"ContainerDied","Data":"f469bcfdfed07a10ca27f4d70d81fd8032c797ccc68f4422fc5c0bcd3fabfda9"} Sep 30 19:34:58 crc kubenswrapper[4553]: I0930 19:34:58.988622 4553 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-multus/network-metrics-daemon-swqk9" podStartSLOduration=131.988602877 podStartE2EDuration="2m11.988602877s" podCreationTimestamp="2025-09-30 19:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:34:58.961426299 +0000 UTC m=+152.160928429" watchObservedRunningTime="2025-09-30 19:34:58.988602877 +0000 UTC m=+152.188104997" Sep 30 19:34:59 crc kubenswrapper[4553]: I0930 19:34:59.066878 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:34:59 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:34:59 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:34:59 crc kubenswrapper[4553]: healthz check failed Sep 30 19:34:59 crc kubenswrapper[4553]: I0930 19:34:59.066949 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:34:59 crc kubenswrapper[4553]: I0930 19:34:59.487993 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 19:34:59 crc kubenswrapper[4553]: I0930 19:34:59.586723 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:34:59 crc kubenswrapper[4553]: I0930 19:34:59.586815 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:34:59 crc kubenswrapper[4553]: I0930 19:34:59.633258 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/222f71b8-90ec-44a7-93a8-2a53e30e8560-kube-api-access\") pod \"222f71b8-90ec-44a7-93a8-2a53e30e8560\" (UID: \"222f71b8-90ec-44a7-93a8-2a53e30e8560\") " Sep 30 19:34:59 crc kubenswrapper[4553]: I0930 19:34:59.633429 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/222f71b8-90ec-44a7-93a8-2a53e30e8560-kubelet-dir\") pod \"222f71b8-90ec-44a7-93a8-2a53e30e8560\" (UID: \"222f71b8-90ec-44a7-93a8-2a53e30e8560\") " Sep 30 19:34:59 crc kubenswrapper[4553]: I0930 19:34:59.635172 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/222f71b8-90ec-44a7-93a8-2a53e30e8560-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "222f71b8-90ec-44a7-93a8-2a53e30e8560" (UID: "222f71b8-90ec-44a7-93a8-2a53e30e8560"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:34:59 crc kubenswrapper[4553]: I0930 19:34:59.644397 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/222f71b8-90ec-44a7-93a8-2a53e30e8560-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "222f71b8-90ec-44a7-93a8-2a53e30e8560" (UID: "222f71b8-90ec-44a7-93a8-2a53e30e8560"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:34:59 crc kubenswrapper[4553]: I0930 19:34:59.735067 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/222f71b8-90ec-44a7-93a8-2a53e30e8560-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 19:34:59 crc kubenswrapper[4553]: I0930 19:34:59.735135 4553 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/222f71b8-90ec-44a7-93a8-2a53e30e8560-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 19:35:00 crc kubenswrapper[4553]: I0930 19:35:00.012845 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"222f71b8-90ec-44a7-93a8-2a53e30e8560","Type":"ContainerDied","Data":"4025168f73811af96da552f38eaeda704dd659a3fcd97d58c2f91b0c8f237de6"} Sep 30 19:35:00 crc kubenswrapper[4553]: I0930 19:35:00.012906 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4025168f73811af96da552f38eaeda704dd659a3fcd97d58c2f91b0c8f237de6" Sep 30 19:35:00 crc kubenswrapper[4553]: I0930 19:35:00.013598 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 19:35:00 crc kubenswrapper[4553]: I0930 19:35:00.068309 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:35:00 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:35:00 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:35:00 crc kubenswrapper[4553]: healthz check failed Sep 30 19:35:00 crc kubenswrapper[4553]: I0930 19:35:00.068375 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:35:01 crc kubenswrapper[4553]: I0930 19:35:01.035800 4553 generic.go:334] "Generic (PLEG): container finished" podID="0ffff74f-1337-47da-907a-f0e10382509d" containerID="e7f1e50d9b1b978a8c2db674f12cae304e7107728e65a07d0393edd53b4b20e3" exitCode=0 Sep 30 19:35:01 crc kubenswrapper[4553]: I0930 19:35:01.035842 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" event={"ID":"0ffff74f-1337-47da-907a-f0e10382509d","Type":"ContainerDied","Data":"e7f1e50d9b1b978a8c2db674f12cae304e7107728e65a07d0393edd53b4b20e3"} Sep 30 19:35:01 crc kubenswrapper[4553]: I0930 19:35:01.063631 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:35:01 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:35:01 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:35:01 crc 
kubenswrapper[4553]: healthz check failed Sep 30 19:35:01 crc kubenswrapper[4553]: I0930 19:35:01.063687 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:35:01 crc kubenswrapper[4553]: I0930 19:35:01.649127 4553 patch_prober.go:28] interesting pod/console-f9d7485db-6csmn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Sep 30 19:35:01 crc kubenswrapper[4553]: I0930 19:35:01.649183 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-6csmn" podUID="7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Sep 30 19:35:01 crc kubenswrapper[4553]: I0930 19:35:01.969004 4553 patch_prober.go:28] interesting pod/downloads-7954f5f757-j2fv9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Sep 30 19:35:01 crc kubenswrapper[4553]: I0930 19:35:01.969081 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-j2fv9" podUID="76d4f83a-1f82-4374-bc4d-601f752d318d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Sep 30 19:35:01 crc kubenswrapper[4553]: I0930 19:35:01.969098 4553 patch_prober.go:28] interesting pod/downloads-7954f5f757-j2fv9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": 
dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Sep 30 19:35:01 crc kubenswrapper[4553]: I0930 19:35:01.969162 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-j2fv9" podUID="76d4f83a-1f82-4374-bc4d-601f752d318d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Sep 30 19:35:02 crc kubenswrapper[4553]: I0930 19:35:02.062790 4553 patch_prober.go:28] interesting pod/router-default-5444994796-r22xf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 19:35:02 crc kubenswrapper[4553]: [-]has-synced failed: reason withheld Sep 30 19:35:02 crc kubenswrapper[4553]: [+]process-running ok Sep 30 19:35:02 crc kubenswrapper[4553]: healthz check failed Sep 30 19:35:02 crc kubenswrapper[4553]: I0930 19:35:02.062851 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r22xf" podUID="87431bdf-f949-4c35-916f-e14903939fe1" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 19:35:02 crc kubenswrapper[4553]: I0930 19:35:02.510695 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" Sep 30 19:35:02 crc kubenswrapper[4553]: I0930 19:35:02.581815 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ffff74f-1337-47da-907a-f0e10382509d-secret-volume\") pod \"0ffff74f-1337-47da-907a-f0e10382509d\" (UID: \"0ffff74f-1337-47da-907a-f0e10382509d\") " Sep 30 19:35:02 crc kubenswrapper[4553]: I0930 19:35:02.582002 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2rkm\" (UniqueName: \"kubernetes.io/projected/0ffff74f-1337-47da-907a-f0e10382509d-kube-api-access-h2rkm\") pod \"0ffff74f-1337-47da-907a-f0e10382509d\" (UID: \"0ffff74f-1337-47da-907a-f0e10382509d\") " Sep 30 19:35:02 crc kubenswrapper[4553]: I0930 19:35:02.582125 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ffff74f-1337-47da-907a-f0e10382509d-config-volume\") pod \"0ffff74f-1337-47da-907a-f0e10382509d\" (UID: \"0ffff74f-1337-47da-907a-f0e10382509d\") " Sep 30 19:35:02 crc kubenswrapper[4553]: I0930 19:35:02.583350 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ffff74f-1337-47da-907a-f0e10382509d-config-volume" (OuterVolumeSpecName: "config-volume") pod "0ffff74f-1337-47da-907a-f0e10382509d" (UID: "0ffff74f-1337-47da-907a-f0e10382509d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:35:02 crc kubenswrapper[4553]: I0930 19:35:02.593240 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ffff74f-1337-47da-907a-f0e10382509d-kube-api-access-h2rkm" (OuterVolumeSpecName: "kube-api-access-h2rkm") pod "0ffff74f-1337-47da-907a-f0e10382509d" (UID: "0ffff74f-1337-47da-907a-f0e10382509d"). 
InnerVolumeSpecName "kube-api-access-h2rkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:35:02 crc kubenswrapper[4553]: I0930 19:35:02.603316 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ffff74f-1337-47da-907a-f0e10382509d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0ffff74f-1337-47da-907a-f0e10382509d" (UID: "0ffff74f-1337-47da-907a-f0e10382509d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:35:02 crc kubenswrapper[4553]: I0930 19:35:02.684784 4553 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ffff74f-1337-47da-907a-f0e10382509d-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:35:02 crc kubenswrapper[4553]: I0930 19:35:02.684922 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2rkm\" (UniqueName: \"kubernetes.io/projected/0ffff74f-1337-47da-907a-f0e10382509d-kube-api-access-h2rkm\") on node \"crc\" DevicePath \"\"" Sep 30 19:35:02 crc kubenswrapper[4553]: I0930 19:35:02.684940 4553 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ffff74f-1337-47da-907a-f0e10382509d-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 19:35:03 crc kubenswrapper[4553]: I0930 19:35:03.083784 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:35:03 crc kubenswrapper[4553]: I0930 19:35:03.090259 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r22xf" Sep 30 19:35:03 crc kubenswrapper[4553]: I0930 19:35:03.094165 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" Sep 30 19:35:03 crc kubenswrapper[4553]: I0930 19:35:03.095450 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321010-wlr9z" event={"ID":"0ffff74f-1337-47da-907a-f0e10382509d","Type":"ContainerDied","Data":"8ce8018552e2beca80293e30943c8049adb29398d79ebbe19a253d8aa3f879c0"} Sep 30 19:35:03 crc kubenswrapper[4553]: I0930 19:35:03.095533 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ce8018552e2beca80293e30943c8049adb29398d79ebbe19a253d8aa3f879c0" Sep 30 19:35:11 crc kubenswrapper[4553]: I0930 19:35:11.654650 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:35:11 crc kubenswrapper[4553]: I0930 19:35:11.658992 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:35:11 crc kubenswrapper[4553]: I0930 19:35:11.975365 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-j2fv9" Sep 30 19:35:15 crc kubenswrapper[4553]: I0930 19:35:15.519029 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:35:22 crc kubenswrapper[4553]: I0930 19:35:22.732018 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-r49lm" Sep 30 19:35:29 crc kubenswrapper[4553]: I0930 19:35:29.585135 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 
19:35:29 crc kubenswrapper[4553]: I0930 19:35:29.585991 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:35:30 crc kubenswrapper[4553]: E0930 19:35:30.904535 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Sep 30 19:35:30 crc kubenswrapper[4553]: E0930 19:35:30.905368 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gtv7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-q4c98_openshift-marketplace(99d1afa5-f3ac-46ec-9f99-d0fab6b82935): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 19:35:30 crc kubenswrapper[4553]: E0930 19:35:30.906643 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-q4c98" podUID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" Sep 30 19:35:31 crc 
kubenswrapper[4553]: E0930 19:35:31.649837 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-q4c98" podUID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" Sep 30 19:35:31 crc kubenswrapper[4553]: E0930 19:35:31.714064 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 19:35:31 crc kubenswrapper[4553]: E0930 19:35:31.714213 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fw255,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-r9x8c_openshift-marketplace(f7d8b1f0-6dc8-4242-b2db-709ed240e30d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 19:35:31 crc kubenswrapper[4553]: E0930 19:35:31.715530 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-r9x8c" podUID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" Sep 30 19:35:31 crc 
kubenswrapper[4553]: E0930 19:35:31.762443 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 19:35:31 crc kubenswrapper[4553]: E0930 19:35:31.762637 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn6rc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-4btj5_openshift-marketplace(9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 19:35:31 crc kubenswrapper[4553]: E0930 19:35:31.763824 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4btj5" podUID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" Sep 30 19:35:33 crc kubenswrapper[4553]: E0930 19:35:33.399103 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4btj5" podUID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" Sep 30 19:35:33 crc kubenswrapper[4553]: E0930 19:35:33.399117 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-r9x8c" podUID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" Sep 30 19:35:33 crc kubenswrapper[4553]: E0930 19:35:33.517833 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 19:35:33 crc kubenswrapper[4553]: E0930 19:35:33.517991 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vnkpp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-r2vvz_openshift-marketplace(3be1f54e-bb1a-4ef0-90b8-865875aa543e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 19:35:33 crc kubenswrapper[4553]: E0930 19:35:33.519669 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-r2vvz" podUID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" Sep 30 19:35:33 crc kubenswrapper[4553]: E0930 19:35:33.523680 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 19:35:33 crc kubenswrapper[4553]: E0930 19:35:33.523837 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xwlq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-nh4pv_openshift-marketplace(2b3d8fa7-639e-46a6-8555-e5930dcc81c9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 19:35:33 crc kubenswrapper[4553]: E0930 19:35:33.525062 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nh4pv" podUID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" Sep 30 19:35:34 crc kubenswrapper[4553]: I0930 19:35:34.355761 4553 generic.go:334] "Generic (PLEG): container finished" podID="75d78589-abdf-4d0e-a6d2-6649c506a9aa" containerID="6d0f2880a3ace2a5c8d158644c2c8cb8b645fec76cd801d9c844691ca84dd761" exitCode=0 Sep 30 19:35:34 crc kubenswrapper[4553]: I0930 19:35:34.355837 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vprtz" event={"ID":"75d78589-abdf-4d0e-a6d2-6649c506a9aa","Type":"ContainerDied","Data":"6d0f2880a3ace2a5c8d158644c2c8cb8b645fec76cd801d9c844691ca84dd761"} Sep 30 19:35:34 crc kubenswrapper[4553]: I0930 19:35:34.364365 4553 generic.go:334] "Generic (PLEG): container finished" podID="48844b8a-7077-4916-8e11-d21992f206e0" containerID="bc47cd56ea07b3578d83c6438cc2a8c3587c6bb3c1b37f9f7d496a7ece27f694" exitCode=0 Sep 30 19:35:34 crc kubenswrapper[4553]: I0930 19:35:34.364557 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ln7lj" event={"ID":"48844b8a-7077-4916-8e11-d21992f206e0","Type":"ContainerDied","Data":"bc47cd56ea07b3578d83c6438cc2a8c3587c6bb3c1b37f9f7d496a7ece27f694"} Sep 30 19:35:34 
crc kubenswrapper[4553]: I0930 19:35:34.370936 4553 generic.go:334] "Generic (PLEG): container finished" podID="3419189e-5bb6-44e2-a087-79f44da3bb41" containerID="3dcf7c720ee435731d3eea4582aed30b1ed66ef81e30388f85496ef86b54b16d" exitCode=0 Sep 30 19:35:34 crc kubenswrapper[4553]: I0930 19:35:34.371315 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm42p" event={"ID":"3419189e-5bb6-44e2-a087-79f44da3bb41","Type":"ContainerDied","Data":"3dcf7c720ee435731d3eea4582aed30b1ed66ef81e30388f85496ef86b54b16d"} Sep 30 19:35:34 crc kubenswrapper[4553]: E0930 19:35:34.377383 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nh4pv" podUID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" Sep 30 19:35:34 crc kubenswrapper[4553]: E0930 19:35:34.379316 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-r2vvz" podUID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" Sep 30 19:35:35 crc kubenswrapper[4553]: I0930 19:35:35.481076 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 19:35:39 crc kubenswrapper[4553]: I0930 19:35:39.410734 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vprtz" event={"ID":"75d78589-abdf-4d0e-a6d2-6649c506a9aa","Type":"ContainerStarted","Data":"bcb1c25d500d6763c90ac4ce66b0069a9e3b9d7bbe21e8606d1074bffd446c03"} Sep 30 19:35:39 crc kubenswrapper[4553]: I0930 19:35:39.441069 4553 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-vprtz" podStartSLOduration=5.024795326 podStartE2EDuration="47.441050218s" podCreationTimestamp="2025-09-30 19:34:52 +0000 UTC" firstStartedPulling="2025-09-30 19:34:55.447645272 +0000 UTC m=+148.647147402" lastFinishedPulling="2025-09-30 19:35:37.863900154 +0000 UTC m=+191.063402294" observedRunningTime="2025-09-30 19:35:39.438382597 +0000 UTC m=+192.637884747" watchObservedRunningTime="2025-09-30 19:35:39.441050218 +0000 UTC m=+192.640552358" Sep 30 19:35:40 crc kubenswrapper[4553]: I0930 19:35:40.423812 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ln7lj" event={"ID":"48844b8a-7077-4916-8e11-d21992f206e0","Type":"ContainerStarted","Data":"b36124323b0c9089e9a9967009dd866eff0837062a22fefe74726d84cebd7c14"} Sep 30 19:35:40 crc kubenswrapper[4553]: I0930 19:35:40.441806 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm42p" event={"ID":"3419189e-5bb6-44e2-a087-79f44da3bb41","Type":"ContainerStarted","Data":"68b818f388e0e062f069e429416bfbe3c90046953e131c760873bc30d0cd1e03"} Sep 30 19:35:40 crc kubenswrapper[4553]: I0930 19:35:40.467154 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ln7lj" podStartSLOduration=4.642586597 podStartE2EDuration="48.467138241s" podCreationTimestamp="2025-09-30 19:34:52 +0000 UTC" firstStartedPulling="2025-09-30 19:34:55.390893031 +0000 UTC m=+148.590395161" lastFinishedPulling="2025-09-30 19:35:39.215444635 +0000 UTC m=+192.414946805" observedRunningTime="2025-09-30 19:35:40.445936684 +0000 UTC m=+193.645438814" watchObservedRunningTime="2025-09-30 19:35:40.467138241 +0000 UTC m=+193.666640371" Sep 30 19:35:42 crc kubenswrapper[4553]: I0930 19:35:42.923571 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:35:42 crc 
kubenswrapper[4553]: I0930 19:35:42.923695 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:35:43 crc kubenswrapper[4553]: I0930 19:35:43.382721 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:35:43 crc kubenswrapper[4553]: I0930 19:35:43.406498 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bm42p" podStartSLOduration=8.043313112 podStartE2EDuration="48.406474322s" podCreationTimestamp="2025-09-30 19:34:55 +0000 UTC" firstStartedPulling="2025-09-30 19:34:58.937820687 +0000 UTC m=+152.137322807" lastFinishedPulling="2025-09-30 19:35:39.300981847 +0000 UTC m=+192.500484017" observedRunningTime="2025-09-30 19:35:40.469986078 +0000 UTC m=+193.669488198" watchObservedRunningTime="2025-09-30 19:35:43.406474322 +0000 UTC m=+196.605976462" Sep 30 19:35:43 crc kubenswrapper[4553]: I0930 19:35:43.638133 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:35:43 crc kubenswrapper[4553]: I0930 19:35:43.638199 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:35:43 crc kubenswrapper[4553]: I0930 19:35:43.678976 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:35:44 crc kubenswrapper[4553]: I0930 19:35:44.539751 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:35:44 crc kubenswrapper[4553]: I0930 19:35:44.551005 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:35:44 crc kubenswrapper[4553]: I0930 
19:35:44.637946 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vprtz"] Sep 30 19:35:46 crc kubenswrapper[4553]: I0930 19:35:46.181526 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:35:46 crc kubenswrapper[4553]: I0930 19:35:46.182017 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:35:46 crc kubenswrapper[4553]: I0930 19:35:46.240537 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:35:46 crc kubenswrapper[4553]: I0930 19:35:46.485828 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vprtz" podUID="75d78589-abdf-4d0e-a6d2-6649c506a9aa" containerName="registry-server" containerID="cri-o://bcb1c25d500d6763c90ac4ce66b0069a9e3b9d7bbe21e8606d1074bffd446c03" gracePeriod=2 Sep 30 19:35:46 crc kubenswrapper[4553]: I0930 19:35:46.546931 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:35:47 crc kubenswrapper[4553]: I0930 19:35:47.501942 4553 generic.go:334] "Generic (PLEG): container finished" podID="75d78589-abdf-4d0e-a6d2-6649c506a9aa" containerID="bcb1c25d500d6763c90ac4ce66b0069a9e3b9d7bbe21e8606d1074bffd446c03" exitCode=0 Sep 30 19:35:47 crc kubenswrapper[4553]: I0930 19:35:47.502637 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vprtz" event={"ID":"75d78589-abdf-4d0e-a6d2-6649c506a9aa","Type":"ContainerDied","Data":"bcb1c25d500d6763c90ac4ce66b0069a9e3b9d7bbe21e8606d1074bffd446c03"} Sep 30 19:35:47 crc kubenswrapper[4553]: I0930 19:35:47.986527 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:35:48 crc kubenswrapper[4553]: E0930 19:35:48.074029 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: parsing image configuration: Download config.json digest sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 does not match expected sha256:e3f3e6eb6c7e6c6ffc2d44f0a98e2d2621dc422f55b503909b06674ee25a832a" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 19:35:48 crc kubenswrapper[4553]: E0930 19:35:48.074236 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn6rc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArm
orProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4btj5_openshift-marketplace(9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99): ErrImagePull: copying system image from manifest list: parsing image configuration: Download config.json digest sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 does not match expected sha256:e3f3e6eb6c7e6c6ffc2d44f0a98e2d2621dc422f55b503909b06674ee25a832a" logger="UnhandledError" Sep 30 19:35:48 crc kubenswrapper[4553]: E0930 19:35:48.076195 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: parsing image configuration: Download config.json digest sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 does not match expected sha256:e3f3e6eb6c7e6c6ffc2d44f0a98e2d2621dc422f55b503909b06674ee25a832a\"" pod="openshift-marketplace/redhat-marketplace-4btj5" podUID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.159681 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d78589-abdf-4d0e-a6d2-6649c506a9aa-catalog-content\") pod \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\" (UID: \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\") " Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.159760 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwzjb\" (UniqueName: \"kubernetes.io/projected/75d78589-abdf-4d0e-a6d2-6649c506a9aa-kube-api-access-jwzjb\") pod \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\" (UID: \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\") " Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.159792 
4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d78589-abdf-4d0e-a6d2-6649c506a9aa-utilities\") pod \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\" (UID: \"75d78589-abdf-4d0e-a6d2-6649c506a9aa\") " Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.160908 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d78589-abdf-4d0e-a6d2-6649c506a9aa-utilities" (OuterVolumeSpecName: "utilities") pod "75d78589-abdf-4d0e-a6d2-6649c506a9aa" (UID: "75d78589-abdf-4d0e-a6d2-6649c506a9aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.172683 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d78589-abdf-4d0e-a6d2-6649c506a9aa-kube-api-access-jwzjb" (OuterVolumeSpecName: "kube-api-access-jwzjb") pod "75d78589-abdf-4d0e-a6d2-6649c506a9aa" (UID: "75d78589-abdf-4d0e-a6d2-6649c506a9aa"). InnerVolumeSpecName "kube-api-access-jwzjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.209912 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d78589-abdf-4d0e-a6d2-6649c506a9aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75d78589-abdf-4d0e-a6d2-6649c506a9aa" (UID: "75d78589-abdf-4d0e-a6d2-6649c506a9aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.261221 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d78589-abdf-4d0e-a6d2-6649c506a9aa-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.261270 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwzjb\" (UniqueName: \"kubernetes.io/projected/75d78589-abdf-4d0e-a6d2-6649c506a9aa-kube-api-access-jwzjb\") on node \"crc\" DevicePath \"\"" Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.261292 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d78589-abdf-4d0e-a6d2-6649c506a9aa-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.515737 4553 generic.go:334] "Generic (PLEG): container finished" podID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" containerID="91718ca6b5776394398a6b2a7314c5018f88c3e18d989f75f0b0aa82ab3a7edd" exitCode=0 Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.515838 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9x8c" event={"ID":"f7d8b1f0-6dc8-4242-b2db-709ed240e30d","Type":"ContainerDied","Data":"91718ca6b5776394398a6b2a7314c5018f88c3e18d989f75f0b0aa82ab3a7edd"} Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.520161 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4c98" event={"ID":"99d1afa5-f3ac-46ec-9f99-d0fab6b82935","Type":"ContainerStarted","Data":"f9d9b16233c7d7cc62bfcded49736c647e24171bae2848456e0afd354e926d21"} Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.525214 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vprtz" 
event={"ID":"75d78589-abdf-4d0e-a6d2-6649c506a9aa","Type":"ContainerDied","Data":"46a56a13efe8f671cc4e7af3ba4cb0c18709a59c6047078950738eaef0a66969"} Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.525264 4553 scope.go:117] "RemoveContainer" containerID="bcb1c25d500d6763c90ac4ce66b0069a9e3b9d7bbe21e8606d1074bffd446c03" Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.525434 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vprtz" Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.550206 4553 scope.go:117] "RemoveContainer" containerID="6d0f2880a3ace2a5c8d158644c2c8cb8b645fec76cd801d9c844691ca84dd761" Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.587393 4553 scope.go:117] "RemoveContainer" containerID="7691965aa1c228b31a8a482237536cc993e2ea7ca10564afce2a96b3a5513f2b" Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.621546 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vprtz"] Sep 30 19:35:48 crc kubenswrapper[4553]: I0930 19:35:48.628308 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vprtz"] Sep 30 19:35:49 crc kubenswrapper[4553]: I0930 19:35:49.513082 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d78589-abdf-4d0e-a6d2-6649c506a9aa" path="/var/lib/kubelet/pods/75d78589-abdf-4d0e-a6d2-6649c506a9aa/volumes" Sep 30 19:35:49 crc kubenswrapper[4553]: I0930 19:35:49.532973 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9x8c" event={"ID":"f7d8b1f0-6dc8-4242-b2db-709ed240e30d","Type":"ContainerStarted","Data":"dd877f3d26c2b976e29407ed18120d56a18a6fd3131cc18307e618f39624280f"} Sep 30 19:35:49 crc kubenswrapper[4553]: I0930 19:35:49.536631 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2vvz" 
event={"ID":"3be1f54e-bb1a-4ef0-90b8-865875aa543e","Type":"ContainerStarted","Data":"22e4979cc9726fb7dddd2da19fed4022571ab069c16e88ca5d9a80e57777d0b4"} Sep 30 19:35:49 crc kubenswrapper[4553]: I0930 19:35:49.538401 4553 generic.go:334] "Generic (PLEG): container finished" podID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" containerID="f9d9b16233c7d7cc62bfcded49736c647e24171bae2848456e0afd354e926d21" exitCode=0 Sep 30 19:35:49 crc kubenswrapper[4553]: I0930 19:35:49.538445 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4c98" event={"ID":"99d1afa5-f3ac-46ec-9f99-d0fab6b82935","Type":"ContainerDied","Data":"f9d9b16233c7d7cc62bfcded49736c647e24171bae2848456e0afd354e926d21"} Sep 30 19:35:49 crc kubenswrapper[4553]: I0930 19:35:49.593170 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r9x8c" podStartSLOduration=4.456090819 podStartE2EDuration="55.593152745s" podCreationTimestamp="2025-09-30 19:34:54 +0000 UTC" firstStartedPulling="2025-09-30 19:34:57.86571629 +0000 UTC m=+151.065218420" lastFinishedPulling="2025-09-30 19:35:49.002778216 +0000 UTC m=+202.202280346" observedRunningTime="2025-09-30 19:35:49.566980693 +0000 UTC m=+202.766482813" watchObservedRunningTime="2025-09-30 19:35:49.593152745 +0000 UTC m=+202.792654875" Sep 30 19:35:50 crc kubenswrapper[4553]: I0930 19:35:50.549403 4553 generic.go:334] "Generic (PLEG): container finished" podID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" containerID="22e4979cc9726fb7dddd2da19fed4022571ab069c16e88ca5d9a80e57777d0b4" exitCode=0 Sep 30 19:35:50 crc kubenswrapper[4553]: I0930 19:35:50.549467 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2vvz" event={"ID":"3be1f54e-bb1a-4ef0-90b8-865875aa543e","Type":"ContainerDied","Data":"22e4979cc9726fb7dddd2da19fed4022571ab069c16e88ca5d9a80e57777d0b4"} Sep 30 19:35:50 crc kubenswrapper[4553]: I0930 19:35:50.552345 
4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4c98" event={"ID":"99d1afa5-f3ac-46ec-9f99-d0fab6b82935","Type":"ContainerStarted","Data":"4c75933a9f2ea36a8487ba927612785747cb73944db94cc02dd4bdb3e71f1d32"} Sep 30 19:35:50 crc kubenswrapper[4553]: I0930 19:35:50.563123 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh4pv" event={"ID":"2b3d8fa7-639e-46a6-8555-e5930dcc81c9","Type":"ContainerStarted","Data":"17845fcf9a71a95d879331e9998e52ac2f16a47f0b56cfb7e56cee56d68f0f69"} Sep 30 19:35:50 crc kubenswrapper[4553]: I0930 19:35:50.616317 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q4c98" podStartSLOduration=3.637812076 podStartE2EDuration="54.616296572s" podCreationTimestamp="2025-09-30 19:34:56 +0000 UTC" firstStartedPulling="2025-09-30 19:34:58.968328643 +0000 UTC m=+152.167830763" lastFinishedPulling="2025-09-30 19:35:49.946813119 +0000 UTC m=+203.146315259" observedRunningTime="2025-09-30 19:35:50.606608331 +0000 UTC m=+203.806110461" watchObservedRunningTime="2025-09-30 19:35:50.616296572 +0000 UTC m=+203.815798712" Sep 30 19:35:51 crc kubenswrapper[4553]: I0930 19:35:51.569742 4553 generic.go:334] "Generic (PLEG): container finished" podID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" containerID="17845fcf9a71a95d879331e9998e52ac2f16a47f0b56cfb7e56cee56d68f0f69" exitCode=0 Sep 30 19:35:51 crc kubenswrapper[4553]: I0930 19:35:51.569818 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh4pv" event={"ID":"2b3d8fa7-639e-46a6-8555-e5930dcc81c9","Type":"ContainerDied","Data":"17845fcf9a71a95d879331e9998e52ac2f16a47f0b56cfb7e56cee56d68f0f69"} Sep 30 19:35:51 crc kubenswrapper[4553]: I0930 19:35:51.569844 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh4pv" 
event={"ID":"2b3d8fa7-639e-46a6-8555-e5930dcc81c9","Type":"ContainerStarted","Data":"bec84bd9f26f4f566390eea6d14cea765dac211933d462cb1428264542975584"} Sep 30 19:35:51 crc kubenswrapper[4553]: I0930 19:35:51.572528 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2vvz" event={"ID":"3be1f54e-bb1a-4ef0-90b8-865875aa543e","Type":"ContainerStarted","Data":"e5ac1f059bb83364ae2aa4b2c7b5fd452a0c26dac2a8962bac35ea48e8d617f5"} Sep 30 19:35:51 crc kubenswrapper[4553]: I0930 19:35:51.593984 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nh4pv" podStartSLOduration=3.853033226 podStartE2EDuration="59.593966627s" podCreationTimestamp="2025-09-30 19:34:52 +0000 UTC" firstStartedPulling="2025-09-30 19:34:55.417885855 +0000 UTC m=+148.617387985" lastFinishedPulling="2025-09-30 19:35:51.158819256 +0000 UTC m=+204.358321386" observedRunningTime="2025-09-30 19:35:51.59034447 +0000 UTC m=+204.789846610" watchObservedRunningTime="2025-09-30 19:35:51.593966627 +0000 UTC m=+204.793468767" Sep 30 19:35:51 crc kubenswrapper[4553]: I0930 19:35:51.609148 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r2vvz" podStartSLOduration=4.206549402 podStartE2EDuration="58.609130524s" podCreationTimestamp="2025-09-30 19:34:53 +0000 UTC" firstStartedPulling="2025-09-30 19:34:56.570919039 +0000 UTC m=+149.770421169" lastFinishedPulling="2025-09-30 19:35:50.973500171 +0000 UTC m=+204.173002291" observedRunningTime="2025-09-30 19:35:51.607562352 +0000 UTC m=+204.807064482" watchObservedRunningTime="2025-09-30 19:35:51.609130524 +0000 UTC m=+204.808632654" Sep 30 19:35:53 crc kubenswrapper[4553]: I0930 19:35:53.965221 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:35:53 crc kubenswrapper[4553]: I0930 19:35:53.966723 4553 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:35:53 crc kubenswrapper[4553]: I0930 19:35:53.968300 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:35:53 crc kubenswrapper[4553]: I0930 19:35:53.968338 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:35:54 crc kubenswrapper[4553]: I0930 19:35:54.022678 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:35:54 crc kubenswrapper[4553]: I0930 19:35:54.036494 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r2vvz" Sep 30 19:35:55 crc kubenswrapper[4553]: I0930 19:35:55.399072 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:35:55 crc kubenswrapper[4553]: I0930 19:35:55.399158 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:35:55 crc kubenswrapper[4553]: I0930 19:35:55.454034 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:35:55 crc kubenswrapper[4553]: I0930 19:35:55.679944 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:35:56 crc kubenswrapper[4553]: I0930 19:35:56.521506 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:35:56 crc kubenswrapper[4553]: I0930 19:35:56.521575 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:35:56 crc kubenswrapper[4553]: I0930 19:35:56.595257 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:35:56 crc kubenswrapper[4553]: I0930 19:35:56.675399 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q4c98" Sep 30 19:35:59 crc kubenswrapper[4553]: I0930 19:35:59.585808 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:35:59 crc kubenswrapper[4553]: I0930 19:35:59.586462 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:35:59 crc kubenswrapper[4553]: I0930 19:35:59.586539 4553 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:35:59 crc kubenswrapper[4553]: I0930 19:35:59.587462 4553 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d"} pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:35:59 crc kubenswrapper[4553]: I0930 19:35:59.587674 4553 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" containerID="cri-o://dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d" gracePeriod=600
Sep 30 19:36:00 crc kubenswrapper[4553]: I0930 19:36:00.024659 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q4c98"]
Sep 30 19:36:00 crc kubenswrapper[4553]: I0930 19:36:00.024912 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q4c98" podUID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" containerName="registry-server" containerID="cri-o://4c75933a9f2ea36a8487ba927612785747cb73944db94cc02dd4bdb3e71f1d32" gracePeriod=2
Sep 30 19:36:00 crc kubenswrapper[4553]: I0930 19:36:00.648103 4553 generic.go:334] "Generic (PLEG): container finished" podID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerID="dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d" exitCode=0
Sep 30 19:36:00 crc kubenswrapper[4553]: I0930 19:36:00.648322 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerDied","Data":"dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d"}
Sep 30 19:36:00 crc kubenswrapper[4553]: I0930 19:36:00.652374 4553 generic.go:334] "Generic (PLEG): container finished" podID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" containerID="4c75933a9f2ea36a8487ba927612785747cb73944db94cc02dd4bdb3e71f1d32" exitCode=0
Sep 30 19:36:00 crc kubenswrapper[4553]: I0930 19:36:00.652462 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4c98" event={"ID":"99d1afa5-f3ac-46ec-9f99-d0fab6b82935","Type":"ContainerDied","Data":"4c75933a9f2ea36a8487ba927612785747cb73944db94cc02dd4bdb3e71f1d32"}
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.147507 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q4c98"
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.256227 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-catalog-content\") pod \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\" (UID: \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\") "
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.256345 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtv7k\" (UniqueName: \"kubernetes.io/projected/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-kube-api-access-gtv7k\") pod \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\" (UID: \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\") "
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.256388 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-utilities\") pod \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\" (UID: \"99d1afa5-f3ac-46ec-9f99-d0fab6b82935\") "
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.259091 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-utilities" (OuterVolumeSpecName: "utilities") pod "99d1afa5-f3ac-46ec-9f99-d0fab6b82935" (UID: "99d1afa5-f3ac-46ec-9f99-d0fab6b82935"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.282261 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-kube-api-access-gtv7k" (OuterVolumeSpecName: "kube-api-access-gtv7k") pod "99d1afa5-f3ac-46ec-9f99-d0fab6b82935" (UID: "99d1afa5-f3ac-46ec-9f99-d0fab6b82935"). InnerVolumeSpecName "kube-api-access-gtv7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.356806 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99d1afa5-f3ac-46ec-9f99-d0fab6b82935" (UID: "99d1afa5-f3ac-46ec-9f99-d0fab6b82935"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.359683 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.359739 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtv7k\" (UniqueName: \"kubernetes.io/projected/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-kube-api-access-gtv7k\") on node \"crc\" DevicePath \"\""
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.359759 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99d1afa5-f3ac-46ec-9f99-d0fab6b82935-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.664381 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4c98" event={"ID":"99d1afa5-f3ac-46ec-9f99-d0fab6b82935","Type":"ContainerDied","Data":"5fd009f2cebbcf50bdc4ae32344997351724dd9a3c6d378ebba02f9b7420a5d4"}
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.664472 4553 scope.go:117] "RemoveContainer" containerID="4c75933a9f2ea36a8487ba927612785747cb73944db94cc02dd4bdb3e71f1d32"
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.665981 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q4c98"
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.671018 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerStarted","Data":"7c114586c54354df4e3892b93d193976a14755ff2513086bcc2ebc83fbe5f06f"}
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.711632 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q4c98"]
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.713336 4553 scope.go:117] "RemoveContainer" containerID="f9d9b16233c7d7cc62bfcded49736c647e24171bae2848456e0afd354e926d21"
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.716945 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q4c98"]
Sep 30 19:36:01 crc kubenswrapper[4553]: I0930 19:36:01.736208 4553 scope.go:117] "RemoveContainer" containerID="f469bcfdfed07a10ca27f4d70d81fd8032c797ccc68f4422fc5c0bcd3fabfda9"
Sep 30 19:36:02 crc kubenswrapper[4553]: E0930 19:36:02.512831 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4btj5" podUID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99"
Sep 30 19:36:03 crc kubenswrapper[4553]: I0930 19:36:03.513771 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" path="/var/lib/kubelet/pods/99d1afa5-f3ac-46ec-9f99-d0fab6b82935/volumes"
Sep 30 19:36:04 crc kubenswrapper[4553]: I0930 19:36:04.027464 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r2vvz"
Sep 30 19:36:04 crc kubenswrapper[4553]: I0930 19:36:04.044959 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nh4pv"
Sep 30 19:36:05 crc kubenswrapper[4553]: I0930 19:36:05.557372 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2chmh"]
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.223707 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r2vvz"]
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.224733 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r2vvz" podUID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" containerName="registry-server" containerID="cri-o://e5ac1f059bb83364ae2aa4b2c7b5fd452a0c26dac2a8962bac35ea48e8d617f5" gracePeriod=2
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.665924 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r2vvz"
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.713380 4553 generic.go:334] "Generic (PLEG): container finished" podID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" containerID="e5ac1f059bb83364ae2aa4b2c7b5fd452a0c26dac2a8962bac35ea48e8d617f5" exitCode=0
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.713455 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2vvz" event={"ID":"3be1f54e-bb1a-4ef0-90b8-865875aa543e","Type":"ContainerDied","Data":"e5ac1f059bb83364ae2aa4b2c7b5fd452a0c26dac2a8962bac35ea48e8d617f5"}
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.713495 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r2vvz" event={"ID":"3be1f54e-bb1a-4ef0-90b8-865875aa543e","Type":"ContainerDied","Data":"8e282ab62e68fed10c9967674d0f0267f3439bc5760c65c7b55c473e675f44f2"}
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.713516 4553 scope.go:117] "RemoveContainer" containerID="e5ac1f059bb83364ae2aa4b2c7b5fd452a0c26dac2a8962bac35ea48e8d617f5"
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.713682 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r2vvz"
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.731907 4553 scope.go:117] "RemoveContainer" containerID="22e4979cc9726fb7dddd2da19fed4022571ab069c16e88ca5d9a80e57777d0b4"
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.749777 4553 scope.go:117] "RemoveContainer" containerID="0d2d437f950ff1b8983bdb71311673fee6171e363c36c2ddae8eef3c9bb3faec"
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.773779 4553 scope.go:117] "RemoveContainer" containerID="e5ac1f059bb83364ae2aa4b2c7b5fd452a0c26dac2a8962bac35ea48e8d617f5"
Sep 30 19:36:08 crc kubenswrapper[4553]: E0930 19:36:08.775092 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5ac1f059bb83364ae2aa4b2c7b5fd452a0c26dac2a8962bac35ea48e8d617f5\": container with ID starting with e5ac1f059bb83364ae2aa4b2c7b5fd452a0c26dac2a8962bac35ea48e8d617f5 not found: ID does not exist" containerID="e5ac1f059bb83364ae2aa4b2c7b5fd452a0c26dac2a8962bac35ea48e8d617f5"
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.775133 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5ac1f059bb83364ae2aa4b2c7b5fd452a0c26dac2a8962bac35ea48e8d617f5"} err="failed to get container status \"e5ac1f059bb83364ae2aa4b2c7b5fd452a0c26dac2a8962bac35ea48e8d617f5\": rpc error: code = NotFound desc = could not find container \"e5ac1f059bb83364ae2aa4b2c7b5fd452a0c26dac2a8962bac35ea48e8d617f5\": container with ID starting with e5ac1f059bb83364ae2aa4b2c7b5fd452a0c26dac2a8962bac35ea48e8d617f5 not found: ID does not exist"
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.775158 4553 scope.go:117] "RemoveContainer" containerID="22e4979cc9726fb7dddd2da19fed4022571ab069c16e88ca5d9a80e57777d0b4"
Sep 30 19:36:08 crc kubenswrapper[4553]: E0930 19:36:08.776233 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e4979cc9726fb7dddd2da19fed4022571ab069c16e88ca5d9a80e57777d0b4\": container with ID starting with 22e4979cc9726fb7dddd2da19fed4022571ab069c16e88ca5d9a80e57777d0b4 not found: ID does not exist" containerID="22e4979cc9726fb7dddd2da19fed4022571ab069c16e88ca5d9a80e57777d0b4"
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.776258 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e4979cc9726fb7dddd2da19fed4022571ab069c16e88ca5d9a80e57777d0b4"} err="failed to get container status \"22e4979cc9726fb7dddd2da19fed4022571ab069c16e88ca5d9a80e57777d0b4\": rpc error: code = NotFound desc = could not find container \"22e4979cc9726fb7dddd2da19fed4022571ab069c16e88ca5d9a80e57777d0b4\": container with ID starting with 22e4979cc9726fb7dddd2da19fed4022571ab069c16e88ca5d9a80e57777d0b4 not found: ID does not exist"
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.776271 4553 scope.go:117] "RemoveContainer" containerID="0d2d437f950ff1b8983bdb71311673fee6171e363c36c2ddae8eef3c9bb3faec"
Sep 30 19:36:08 crc kubenswrapper[4553]: E0930 19:36:08.776662 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d2d437f950ff1b8983bdb71311673fee6171e363c36c2ddae8eef3c9bb3faec\": container with ID starting with 0d2d437f950ff1b8983bdb71311673fee6171e363c36c2ddae8eef3c9bb3faec not found: ID does not exist" containerID="0d2d437f950ff1b8983bdb71311673fee6171e363c36c2ddae8eef3c9bb3faec"
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.776686 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d2d437f950ff1b8983bdb71311673fee6171e363c36c2ddae8eef3c9bb3faec"} err="failed to get container status \"0d2d437f950ff1b8983bdb71311673fee6171e363c36c2ddae8eef3c9bb3faec\": rpc error: code = NotFound desc = could not find container \"0d2d437f950ff1b8983bdb71311673fee6171e363c36c2ddae8eef3c9bb3faec\": container with ID starting with 0d2d437f950ff1b8983bdb71311673fee6171e363c36c2ddae8eef3c9bb3faec not found: ID does not exist"
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.800319 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be1f54e-bb1a-4ef0-90b8-865875aa543e-catalog-content\") pod \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\" (UID: \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\") "
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.800393 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be1f54e-bb1a-4ef0-90b8-865875aa543e-utilities\") pod \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\" (UID: \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\") "
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.800424 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnkpp\" (UniqueName: \"kubernetes.io/projected/3be1f54e-bb1a-4ef0-90b8-865875aa543e-kube-api-access-vnkpp\") pod \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\" (UID: \"3be1f54e-bb1a-4ef0-90b8-865875aa543e\") "
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.802187 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3be1f54e-bb1a-4ef0-90b8-865875aa543e-utilities" (OuterVolumeSpecName: "utilities") pod "3be1f54e-bb1a-4ef0-90b8-865875aa543e" (UID: "3be1f54e-bb1a-4ef0-90b8-865875aa543e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.808234 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be1f54e-bb1a-4ef0-90b8-865875aa543e-kube-api-access-vnkpp" (OuterVolumeSpecName: "kube-api-access-vnkpp") pod "3be1f54e-bb1a-4ef0-90b8-865875aa543e" (UID: "3be1f54e-bb1a-4ef0-90b8-865875aa543e"). InnerVolumeSpecName "kube-api-access-vnkpp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.857829 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3be1f54e-bb1a-4ef0-90b8-865875aa543e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3be1f54e-bb1a-4ef0-90b8-865875aa543e" (UID: "3be1f54e-bb1a-4ef0-90b8-865875aa543e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.901961 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be1f54e-bb1a-4ef0-90b8-865875aa543e-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.902009 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnkpp\" (UniqueName: \"kubernetes.io/projected/3be1f54e-bb1a-4ef0-90b8-865875aa543e-kube-api-access-vnkpp\") on node \"crc\" DevicePath \"\""
Sep 30 19:36:08 crc kubenswrapper[4553]: I0930 19:36:08.902022 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be1f54e-bb1a-4ef0-90b8-865875aa543e-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 19:36:09 crc kubenswrapper[4553]: I0930 19:36:09.056761 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r2vvz"]
Sep 30 19:36:09 crc kubenswrapper[4553]: I0930 19:36:09.069479 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r2vvz"]
Sep 30 19:36:09 crc kubenswrapper[4553]: I0930 19:36:09.510339 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" path="/var/lib/kubelet/pods/3be1f54e-bb1a-4ef0-90b8-865875aa543e/volumes"
Sep 30 19:36:17 crc kubenswrapper[4553]: I0930 19:36:17.763030 4553 generic.go:334] "Generic (PLEG): container finished" podID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" containerID="bd063d3d13805cfd2a94253958863e3cbfd67468110e1a945a86253a6ba29525" exitCode=0
Sep 30 19:36:17 crc kubenswrapper[4553]: I0930 19:36:17.763111 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4btj5" event={"ID":"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99","Type":"ContainerDied","Data":"bd063d3d13805cfd2a94253958863e3cbfd67468110e1a945a86253a6ba29525"}
Sep 30 19:36:18 crc kubenswrapper[4553]: I0930 19:36:18.772797 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4btj5" event={"ID":"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99","Type":"ContainerStarted","Data":"5380a903d1e68fa0bb9a4ec7def742a962dc12ccaa7d144e6cbf9e9363d46fa6"}
Sep 30 19:36:18 crc kubenswrapper[4553]: I0930 19:36:18.800081 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4btj5" podStartSLOduration=3.429663171 podStartE2EDuration="1m23.800026523s" podCreationTimestamp="2025-09-30 19:34:55 +0000 UTC" firstStartedPulling="2025-09-30 19:34:57.838394728 +0000 UTC m=+151.037896858" lastFinishedPulling="2025-09-30 19:36:18.20875808 +0000 UTC m=+231.408260210" observedRunningTime="2025-09-30 19:36:18.794663679 +0000 UTC m=+231.994165819" watchObservedRunningTime="2025-09-30 19:36:18.800026523 +0000 UTC m=+231.999528693"
Sep 30 19:36:25 crc kubenswrapper[4553]: I0930 19:36:25.531930 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4btj5"
Sep 30 19:36:25 crc kubenswrapper[4553]: I0930 19:36:25.532818 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4btj5"
Sep 30 19:36:25 crc kubenswrapper[4553]: I0930 19:36:25.580888 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4btj5"
Sep 30 19:36:25 crc kubenswrapper[4553]: I0930 19:36:25.893220 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4btj5"
Sep 30 19:36:25 crc kubenswrapper[4553]: I0930 19:36:25.964148 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4btj5"]
Sep 30 19:36:27 crc kubenswrapper[4553]: I0930 19:36:27.830698 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4btj5" podUID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" containerName="registry-server" containerID="cri-o://5380a903d1e68fa0bb9a4ec7def742a962dc12ccaa7d144e6cbf9e9363d46fa6" gracePeriod=2
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.186780 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4btj5"
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.370094 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-utilities\") pod \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\" (UID: \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\") "
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.370265 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn6rc\" (UniqueName: \"kubernetes.io/projected/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-kube-api-access-jn6rc\") pod \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\" (UID: \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\") "
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.370310 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-catalog-content\") pod \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\" (UID: \"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99\") "
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.372303 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-utilities" (OuterVolumeSpecName: "utilities") pod "9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" (UID: "9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.376175 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-kube-api-access-jn6rc" (OuterVolumeSpecName: "kube-api-access-jn6rc") pod "9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" (UID: "9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99"). InnerVolumeSpecName "kube-api-access-jn6rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.384748 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" (UID: "9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.473281 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.473314 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn6rc\" (UniqueName: \"kubernetes.io/projected/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-kube-api-access-jn6rc\") on node \"crc\" DevicePath \"\""
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.473328 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.844176 4553 generic.go:334] "Generic (PLEG): container finished" podID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" containerID="5380a903d1e68fa0bb9a4ec7def742a962dc12ccaa7d144e6cbf9e9363d46fa6" exitCode=0
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.844250 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4btj5" event={"ID":"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99","Type":"ContainerDied","Data":"5380a903d1e68fa0bb9a4ec7def742a962dc12ccaa7d144e6cbf9e9363d46fa6"}
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.844268 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4btj5"
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.844302 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4btj5" event={"ID":"9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99","Type":"ContainerDied","Data":"5fd086f050268b53667629bbfca5b4fc5cc2928075f77ca380d525b3ca00a94a"}
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.844343 4553 scope.go:117] "RemoveContainer" containerID="5380a903d1e68fa0bb9a4ec7def742a962dc12ccaa7d144e6cbf9e9363d46fa6"
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.876699 4553 scope.go:117] "RemoveContainer" containerID="bd063d3d13805cfd2a94253958863e3cbfd67468110e1a945a86253a6ba29525"
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.896966 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4btj5"]
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.918777 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4btj5"]
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.921239 4553 scope.go:117] "RemoveContainer" containerID="94c43452172f59092392b21d96ae0aa26f5399fa72f56feb9342b1192a5a5135"
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.944777 4553 scope.go:117] "RemoveContainer" containerID="5380a903d1e68fa0bb9a4ec7def742a962dc12ccaa7d144e6cbf9e9363d46fa6"
Sep 30 19:36:28 crc kubenswrapper[4553]: E0930 19:36:28.945230 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5380a903d1e68fa0bb9a4ec7def742a962dc12ccaa7d144e6cbf9e9363d46fa6\": container with ID starting with 5380a903d1e68fa0bb9a4ec7def742a962dc12ccaa7d144e6cbf9e9363d46fa6 not found: ID does not exist" containerID="5380a903d1e68fa0bb9a4ec7def742a962dc12ccaa7d144e6cbf9e9363d46fa6"
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.945302 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5380a903d1e68fa0bb9a4ec7def742a962dc12ccaa7d144e6cbf9e9363d46fa6"} err="failed to get container status \"5380a903d1e68fa0bb9a4ec7def742a962dc12ccaa7d144e6cbf9e9363d46fa6\": rpc error: code = NotFound desc = could not find container \"5380a903d1e68fa0bb9a4ec7def742a962dc12ccaa7d144e6cbf9e9363d46fa6\": container with ID starting with 5380a903d1e68fa0bb9a4ec7def742a962dc12ccaa7d144e6cbf9e9363d46fa6 not found: ID does not exist"
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.945343 4553 scope.go:117] "RemoveContainer" containerID="bd063d3d13805cfd2a94253958863e3cbfd67468110e1a945a86253a6ba29525"
Sep 30 19:36:28 crc kubenswrapper[4553]: E0930 19:36:28.945973 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd063d3d13805cfd2a94253958863e3cbfd67468110e1a945a86253a6ba29525\": container with ID starting with bd063d3d13805cfd2a94253958863e3cbfd67468110e1a945a86253a6ba29525 not found: ID does not exist" containerID="bd063d3d13805cfd2a94253958863e3cbfd67468110e1a945a86253a6ba29525"
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.946034 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd063d3d13805cfd2a94253958863e3cbfd67468110e1a945a86253a6ba29525"} err="failed to get container status \"bd063d3d13805cfd2a94253958863e3cbfd67468110e1a945a86253a6ba29525\": rpc error: code = NotFound desc = could not find container \"bd063d3d13805cfd2a94253958863e3cbfd67468110e1a945a86253a6ba29525\": container with ID starting with bd063d3d13805cfd2a94253958863e3cbfd67468110e1a945a86253a6ba29525 not found: ID does not exist"
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.946076 4553 scope.go:117] "RemoveContainer" containerID="94c43452172f59092392b21d96ae0aa26f5399fa72f56feb9342b1192a5a5135"
Sep 30 19:36:28 crc kubenswrapper[4553]: E0930 19:36:28.946458 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94c43452172f59092392b21d96ae0aa26f5399fa72f56feb9342b1192a5a5135\": container with ID starting with 94c43452172f59092392b21d96ae0aa26f5399fa72f56feb9342b1192a5a5135 not found: ID does not exist" containerID="94c43452172f59092392b21d96ae0aa26f5399fa72f56feb9342b1192a5a5135"
Sep 30 19:36:28 crc kubenswrapper[4553]: I0930 19:36:28.946555 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94c43452172f59092392b21d96ae0aa26f5399fa72f56feb9342b1192a5a5135"} err="failed to get container status \"94c43452172f59092392b21d96ae0aa26f5399fa72f56feb9342b1192a5a5135\": rpc error: code = NotFound desc = could not find container \"94c43452172f59092392b21d96ae0aa26f5399fa72f56feb9342b1192a5a5135\": container with ID starting with 94c43452172f59092392b21d96ae0aa26f5399fa72f56feb9342b1192a5a5135 not found: ID does not exist"
Sep 30 19:36:29 crc kubenswrapper[4553]: I0930 19:36:29.517370 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" path="/var/lib/kubelet/pods/9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99/volumes"
Sep 30 19:36:30 crc kubenswrapper[4553]: I0930 19:36:30.599866 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" podUID="8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" containerName="oauth-openshift" containerID="cri-o://430ec297aa2731648b62b620376379bedb450f28b8f9e760bc2096d0ce7fdd37" gracePeriod=15
Sep 30 19:36:30 crc kubenswrapper[4553]: I0930 19:36:30.865753 4553 generic.go:334] "Generic (PLEG): container finished" podID="8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" containerID="430ec297aa2731648b62b620376379bedb450f28b8f9e760bc2096d0ce7fdd37" exitCode=0
Sep 30 19:36:30 crc kubenswrapper[4553]: I0930 19:36:30.865827 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" event={"ID":"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3","Type":"ContainerDied","Data":"430ec297aa2731648b62b620376379bedb450f28b8f9e760bc2096d0ce7fdd37"}
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.082947 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh"
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.214727 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-session\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.214896 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-router-certs\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.214955 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-ocp-branding-template\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.215012 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-provider-selection\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.215120 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-service-ca\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.215196 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-cliconfig\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.215256 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-audit-policies\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.215341 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-idp-0-file-data\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.215382 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-trusted-ca-bundle\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.215416 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-error\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.215478 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5w8z\" (UniqueName: \"kubernetes.io/projected/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-kube-api-access-b5w8z\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.215518 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-audit-dir\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.215578 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-login\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.215624 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-serving-cert\") pod \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\" (UID: \"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3\") "
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.217545 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.217834 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.218766 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.219249 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "audit-policies".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.217576 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.224575 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.226281 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-kube-api-access-b5w8z" (OuterVolumeSpecName: "kube-api-access-b5w8z") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "kube-api-access-b5w8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.226472 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.227238 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.227833 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.228611 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.228994 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.229243 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.229405 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" (UID: "8d9a156e-6cce-4b1a-ab40-dfecf384b7c3"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.316882 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.316944 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.316967 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.316991 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.317013 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.317032 4553 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.317092 4553 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.317113 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.317134 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.317152 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5w8z\" (UniqueName: \"kubernetes.io/projected/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-kube-api-access-b5w8z\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.317170 4553 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.317188 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.317207 4553 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.317224 4553 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.875837 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" event={"ID":"8d9a156e-6cce-4b1a-ab40-dfecf384b7c3","Type":"ContainerDied","Data":"5aeffe9eec8bb2d17c85e3defe60d5c55491d98425b450cdb800d42f847279b7"} Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.875965 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2chmh" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.875950 4553 scope.go:117] "RemoveContainer" containerID="430ec297aa2731648b62b620376379bedb450f28b8f9e760bc2096d0ce7fdd37" Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.918453 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2chmh"] Sep 30 19:36:31 crc kubenswrapper[4553]: I0930 19:36:31.923478 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2chmh"] Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.466441 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7448d7568b-76vx8"] Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.466997 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" containerName="registry-server" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467031 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" containerName="registry-server" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467076 4553 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" containerName="registry-server" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467094 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" containerName="registry-server" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467110 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" containerName="oauth-openshift" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467124 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" containerName="oauth-openshift" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467144 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" containerName="extract-content" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467159 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" containerName="extract-content" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467177 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222f71b8-90ec-44a7-93a8-2a53e30e8560" containerName="pruner" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467190 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="222f71b8-90ec-44a7-93a8-2a53e30e8560" containerName="pruner" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467210 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffff74f-1337-47da-907a-f0e10382509d" containerName="collect-profiles" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467223 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffff74f-1337-47da-907a-f0e10382509d" containerName="collect-profiles" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467239 4553 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" containerName="extract-utilities" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467253 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" containerName="extract-utilities" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467282 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d78589-abdf-4d0e-a6d2-6649c506a9aa" containerName="extract-utilities" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467295 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d78589-abdf-4d0e-a6d2-6649c506a9aa" containerName="extract-utilities" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467313 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" containerName="extract-content" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467326 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" containerName="extract-content" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467343 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d78589-abdf-4d0e-a6d2-6649c506a9aa" containerName="extract-content" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467355 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d78589-abdf-4d0e-a6d2-6649c506a9aa" containerName="extract-content" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467374 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" containerName="registry-server" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467386 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" containerName="registry-server" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467406 4553 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b76da4d2-c7b8-4b2d-9049-09054319874f" containerName="pruner" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467417 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76da4d2-c7b8-4b2d-9049-09054319874f" containerName="pruner" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467431 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" containerName="extract-utilities" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467445 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" containerName="extract-utilities" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467460 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d78589-abdf-4d0e-a6d2-6649c506a9aa" containerName="registry-server" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467472 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d78589-abdf-4d0e-a6d2-6649c506a9aa" containerName="registry-server" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467494 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" containerName="extract-utilities" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467506 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" containerName="extract-utilities" Sep 30 19:36:32 crc kubenswrapper[4553]: E0930 19:36:32.467524 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" containerName="extract-content" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467537 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" containerName="extract-content" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467690 4553 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3be1f54e-bb1a-4ef0-90b8-865875aa543e" containerName="registry-server" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467710 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="222f71b8-90ec-44a7-93a8-2a53e30e8560" containerName="pruner" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467723 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d1afa5-f3ac-46ec-9f99-d0fab6b82935" containerName="registry-server" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467746 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" containerName="oauth-openshift" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467758 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc597f6-5e78-4ded-b2e6-e4fb5fdf7a99" containerName="registry-server" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467771 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d78589-abdf-4d0e-a6d2-6649c506a9aa" containerName="registry-server" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467794 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffff74f-1337-47da-907a-f0e10382509d" containerName="collect-profiles" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.467814 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76da4d2-c7b8-4b2d-9049-09054319874f" containerName="pruner" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.468624 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.476515 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.482572 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.483081 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.483392 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.483674 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.483714 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.484115 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.484034 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.484522 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.484390 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 19:36:32 
crc kubenswrapper[4553]: I0930 19:36:32.485413 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.489316 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.494299 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7448d7568b-76vx8"] Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.509437 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.550179 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.552877 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.649272 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-user-template-login\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.649384 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " 
pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.649516 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-session\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.649592 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-service-ca\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.649656 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba0a496b-0dd4-499a-8527-6a797aad8820-audit-dir\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.649922 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.650027 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-router-certs\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.650186 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.650266 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.650324 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba0a496b-0dd4-499a-8527-6a797aad8820-audit-policies\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.650406 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57fhf\" (UniqueName: 
\"kubernetes.io/projected/ba0a496b-0dd4-499a-8527-6a797aad8820-kube-api-access-57fhf\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.650459 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.650516 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-user-template-error\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.650657 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.752989 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57fhf\" (UniqueName: \"kubernetes.io/projected/ba0a496b-0dd4-499a-8527-6a797aad8820-kube-api-access-57fhf\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: 
\"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.753166 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.753217 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-user-template-error\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.753264 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.753332 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-user-template-login\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.753376 4553 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.753421 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-session\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.753454 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-service-ca\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.753497 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba0a496b-0dd4-499a-8527-6a797aad8820-audit-dir\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.753543 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: 
\"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.753585 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-router-certs\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.755517 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-service-ca\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.755848 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.753641 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.758130 4553 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.758277 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba0a496b-0dd4-499a-8527-6a797aad8820-audit-policies\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.759941 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba0a496b-0dd4-499a-8527-6a797aad8820-audit-policies\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.760165 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba0a496b-0dd4-499a-8527-6a797aad8820-audit-dir\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.761083 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-user-template-error\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc 
kubenswrapper[4553]: I0930 19:36:32.762183 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.765164 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-user-template-login\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.766742 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-session\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.766811 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.767841 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.768396 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.768728 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-system-router-certs\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.776536 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba0a496b-0dd4-499a-8527-6a797aad8820-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.790567 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57fhf\" (UniqueName: \"kubernetes.io/projected/ba0a496b-0dd4-499a-8527-6a797aad8820-kube-api-access-57fhf\") pod \"oauth-openshift-7448d7568b-76vx8\" (UID: \"ba0a496b-0dd4-499a-8527-6a797aad8820\") " pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:32 crc kubenswrapper[4553]: I0930 19:36:32.805779 4553 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:33 crc kubenswrapper[4553]: I0930 19:36:33.070203 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7448d7568b-76vx8"] Sep 30 19:36:33 crc kubenswrapper[4553]: W0930 19:36:33.079418 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba0a496b_0dd4_499a_8527_6a797aad8820.slice/crio-40dd0625988f2c94771fb39c24f1cbbee45c8e97af13877749cc9bdfd1ebd800 WatchSource:0}: Error finding container 40dd0625988f2c94771fb39c24f1cbbee45c8e97af13877749cc9bdfd1ebd800: Status 404 returned error can't find the container with id 40dd0625988f2c94771fb39c24f1cbbee45c8e97af13877749cc9bdfd1ebd800 Sep 30 19:36:33 crc kubenswrapper[4553]: I0930 19:36:33.511112 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d9a156e-6cce-4b1a-ab40-dfecf384b7c3" path="/var/lib/kubelet/pods/8d9a156e-6cce-4b1a-ab40-dfecf384b7c3/volumes" Sep 30 19:36:33 crc kubenswrapper[4553]: I0930 19:36:33.892127 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" event={"ID":"ba0a496b-0dd4-499a-8527-6a797aad8820","Type":"ContainerStarted","Data":"9b95738b66c64f49b06fbe1d16c24e27a184ccc0b6336dceaebb6088f41dd70a"} Sep 30 19:36:33 crc kubenswrapper[4553]: I0930 19:36:33.892197 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" event={"ID":"ba0a496b-0dd4-499a-8527-6a797aad8820","Type":"ContainerStarted","Data":"40dd0625988f2c94771fb39c24f1cbbee45c8e97af13877749cc9bdfd1ebd800"} Sep 30 19:36:33 crc kubenswrapper[4553]: I0930 19:36:33.892491 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:33 crc kubenswrapper[4553]: 
I0930 19:36:33.901388 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" Sep 30 19:36:33 crc kubenswrapper[4553]: I0930 19:36:33.939472 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7448d7568b-76vx8" podStartSLOduration=28.939447265 podStartE2EDuration="28.939447265s" podCreationTimestamp="2025-09-30 19:36:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:36:33.932774156 +0000 UTC m=+247.132276326" watchObservedRunningTime="2025-09-30 19:36:33.939447265 +0000 UTC m=+247.138949435" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.194308 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ln7lj"] Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.195014 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ln7lj" podUID="48844b8a-7077-4916-8e11-d21992f206e0" containerName="registry-server" containerID="cri-o://b36124323b0c9089e9a9967009dd866eff0837062a22fefe74726d84cebd7c14" gracePeriod=30 Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.198570 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nh4pv"] Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.198726 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nh4pv" podUID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" containerName="registry-server" containerID="cri-o://bec84bd9f26f4f566390eea6d14cea765dac211933d462cb1428264542975584" gracePeriod=30 Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.222971 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-zvmr2"] Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.223240 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" podUID="38703e61-9d3b-4d8d-aae8-9740c0948ceb" containerName="marketplace-operator" containerID="cri-o://c2ed7190261109a50f912531cf92b7f50243bf6a782196046a1b6b1a989e1cdd" gracePeriod=30 Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.226870 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9x8c"] Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.227616 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r9x8c" podUID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" containerName="registry-server" containerID="cri-o://dd877f3d26c2b976e29407ed18120d56a18a6fd3131cc18307e618f39624280f" gracePeriod=30 Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.229006 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bm42p"] Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.229279 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bm42p" podUID="3419189e-5bb6-44e2-a087-79f44da3bb41" containerName="registry-server" containerID="cri-o://68b818f388e0e062f069e429416bfbe3c90046953e131c760873bc30d0cd1e03" gracePeriod=30 Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.235823 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lgw6c"] Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.237446 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.258514 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vm7h\" (UniqueName: \"kubernetes.io/projected/c17292c5-31e1-4fd3-80cc-a635a1ee1348-kube-api-access-4vm7h\") pod \"marketplace-operator-79b997595-lgw6c\" (UID: \"c17292c5-31e1-4fd3-80cc-a635a1ee1348\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.258576 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c17292c5-31e1-4fd3-80cc-a635a1ee1348-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lgw6c\" (UID: \"c17292c5-31e1-4fd3-80cc-a635a1ee1348\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.258619 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c17292c5-31e1-4fd3-80cc-a635a1ee1348-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lgw6c\" (UID: \"c17292c5-31e1-4fd3-80cc-a635a1ee1348\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.283712 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lgw6c"] Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.360545 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vm7h\" (UniqueName: \"kubernetes.io/projected/c17292c5-31e1-4fd3-80cc-a635a1ee1348-kube-api-access-4vm7h\") pod \"marketplace-operator-79b997595-lgw6c\" (UID: 
\"c17292c5-31e1-4fd3-80cc-a635a1ee1348\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.360610 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c17292c5-31e1-4fd3-80cc-a635a1ee1348-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lgw6c\" (UID: \"c17292c5-31e1-4fd3-80cc-a635a1ee1348\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.360649 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c17292c5-31e1-4fd3-80cc-a635a1ee1348-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lgw6c\" (UID: \"c17292c5-31e1-4fd3-80cc-a635a1ee1348\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.361903 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c17292c5-31e1-4fd3-80cc-a635a1ee1348-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lgw6c\" (UID: \"c17292c5-31e1-4fd3-80cc-a635a1ee1348\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.369600 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c17292c5-31e1-4fd3-80cc-a635a1ee1348-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lgw6c\" (UID: \"c17292c5-31e1-4fd3-80cc-a635a1ee1348\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.384834 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4vm7h\" (UniqueName: \"kubernetes.io/projected/c17292c5-31e1-4fd3-80cc-a635a1ee1348-kube-api-access-4vm7h\") pod \"marketplace-operator-79b997595-lgw6c\" (UID: \"c17292c5-31e1-4fd3-80cc-a635a1ee1348\") " pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.554986 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.638751 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.668641 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwlq5\" (UniqueName: \"kubernetes.io/projected/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-kube-api-access-xwlq5\") pod \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\" (UID: \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\") " Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.668803 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-utilities\") pod \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\" (UID: \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\") " Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.668831 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-catalog-content\") pod \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\" (UID: \"2b3d8fa7-639e-46a6-8555-e5930dcc81c9\") " Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.673296 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-kube-api-access-xwlq5" 
(OuterVolumeSpecName: "kube-api-access-xwlq5") pod "2b3d8fa7-639e-46a6-8555-e5930dcc81c9" (UID: "2b3d8fa7-639e-46a6-8555-e5930dcc81c9"). InnerVolumeSpecName "kube-api-access-xwlq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.675351 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-utilities" (OuterVolumeSpecName: "utilities") pod "2b3d8fa7-639e-46a6-8555-e5930dcc81c9" (UID: "2b3d8fa7-639e-46a6-8555-e5930dcc81c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.760997 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b3d8fa7-639e-46a6-8555-e5930dcc81c9" (UID: "2b3d8fa7-639e-46a6-8555-e5930dcc81c9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.770698 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.770723 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.770735 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwlq5\" (UniqueName: \"kubernetes.io/projected/2b3d8fa7-639e-46a6-8555-e5930dcc81c9-kube-api-access-xwlq5\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.807856 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.827893 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.871796 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-catalog-content\") pod \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\" (UID: \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\") " Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.871859 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-utilities\") pod \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\" (UID: \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\") " Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.871946 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/38703e61-9d3b-4d8d-aae8-9740c0948ceb-marketplace-operator-metrics\") pod \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\" (UID: \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\") " Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.871980 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw255\" (UniqueName: \"kubernetes.io/projected/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-kube-api-access-fw255\") pod \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\" (UID: \"f7d8b1f0-6dc8-4242-b2db-709ed240e30d\") " Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.872024 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38703e61-9d3b-4d8d-aae8-9740c0948ceb-marketplace-trusted-ca\") pod \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\" (UID: \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\") " Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.872068 4553 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qtnnb\" (UniqueName: \"kubernetes.io/projected/38703e61-9d3b-4d8d-aae8-9740c0948ceb-kube-api-access-qtnnb\") pod \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\" (UID: \"38703e61-9d3b-4d8d-aae8-9740c0948ceb\") " Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.885714 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38703e61-9d3b-4d8d-aae8-9740c0948ceb-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "38703e61-9d3b-4d8d-aae8-9740c0948ceb" (UID: "38703e61-9d3b-4d8d-aae8-9740c0948ceb"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.885808 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38703e61-9d3b-4d8d-aae8-9740c0948ceb-kube-api-access-qtnnb" (OuterVolumeSpecName: "kube-api-access-qtnnb") pod "38703e61-9d3b-4d8d-aae8-9740c0948ceb" (UID: "38703e61-9d3b-4d8d-aae8-9740c0948ceb"). InnerVolumeSpecName "kube-api-access-qtnnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.886920 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-utilities" (OuterVolumeSpecName: "utilities") pod "f7d8b1f0-6dc8-4242-b2db-709ed240e30d" (UID: "f7d8b1f0-6dc8-4242-b2db-709ed240e30d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.888418 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38703e61-9d3b-4d8d-aae8-9740c0948ceb-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "38703e61-9d3b-4d8d-aae8-9740c0948ceb" (UID: "38703e61-9d3b-4d8d-aae8-9740c0948ceb"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.907310 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.917072 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-kube-api-access-fw255" (OuterVolumeSpecName: "kube-api-access-fw255") pod "f7d8b1f0-6dc8-4242-b2db-709ed240e30d" (UID: "f7d8b1f0-6dc8-4242-b2db-709ed240e30d"). InnerVolumeSpecName "kube-api-access-fw255". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.948727 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7d8b1f0-6dc8-4242-b2db-709ed240e30d" (UID: "f7d8b1f0-6dc8-4242-b2db-709ed240e30d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.972754 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxdb7\" (UniqueName: \"kubernetes.io/projected/3419189e-5bb6-44e2-a087-79f44da3bb41-kube-api-access-vxdb7\") pod \"3419189e-5bb6-44e2-a087-79f44da3bb41\" (UID: \"3419189e-5bb6-44e2-a087-79f44da3bb41\") " Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.972858 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3419189e-5bb6-44e2-a087-79f44da3bb41-utilities\") pod \"3419189e-5bb6-44e2-a087-79f44da3bb41\" (UID: \"3419189e-5bb6-44e2-a087-79f44da3bb41\") " Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.972904 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3419189e-5bb6-44e2-a087-79f44da3bb41-catalog-content\") pod \"3419189e-5bb6-44e2-a087-79f44da3bb41\" (UID: \"3419189e-5bb6-44e2-a087-79f44da3bb41\") " Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.973141 4553 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/38703e61-9d3b-4d8d-aae8-9740c0948ceb-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.973159 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw255\" (UniqueName: \"kubernetes.io/projected/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-kube-api-access-fw255\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.973168 4553 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38703e61-9d3b-4d8d-aae8-9740c0948ceb-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:47 
crc kubenswrapper[4553]: I0930 19:36:47.973179 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtnnb\" (UniqueName: \"kubernetes.io/projected/38703e61-9d3b-4d8d-aae8-9740c0948ceb-kube-api-access-qtnnb\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.973188 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.973198 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7d8b1f0-6dc8-4242-b2db-709ed240e30d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.975765 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3419189e-5bb6-44e2-a087-79f44da3bb41-kube-api-access-vxdb7" (OuterVolumeSpecName: "kube-api-access-vxdb7") pod "3419189e-5bb6-44e2-a087-79f44da3bb41" (UID: "3419189e-5bb6-44e2-a087-79f44da3bb41"). InnerVolumeSpecName "kube-api-access-vxdb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:36:47 crc kubenswrapper[4553]: I0930 19:36:47.975892 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3419189e-5bb6-44e2-a087-79f44da3bb41-utilities" (OuterVolumeSpecName: "utilities") pod "3419189e-5bb6-44e2-a087-79f44da3bb41" (UID: "3419189e-5bb6-44e2-a087-79f44da3bb41"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.012856 4553 generic.go:334] "Generic (PLEG): container finished" podID="48844b8a-7077-4916-8e11-d21992f206e0" containerID="b36124323b0c9089e9a9967009dd866eff0837062a22fefe74726d84cebd7c14" exitCode=0 Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.012933 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ln7lj" event={"ID":"48844b8a-7077-4916-8e11-d21992f206e0","Type":"ContainerDied","Data":"b36124323b0c9089e9a9967009dd866eff0837062a22fefe74726d84cebd7c14"} Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.024835 4553 generic.go:334] "Generic (PLEG): container finished" podID="3419189e-5bb6-44e2-a087-79f44da3bb41" containerID="68b818f388e0e062f069e429416bfbe3c90046953e131c760873bc30d0cd1e03" exitCode=0 Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.025113 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm42p" event={"ID":"3419189e-5bb6-44e2-a087-79f44da3bb41","Type":"ContainerDied","Data":"68b818f388e0e062f069e429416bfbe3c90046953e131c760873bc30d0cd1e03"} Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.025202 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bm42p" event={"ID":"3419189e-5bb6-44e2-a087-79f44da3bb41","Type":"ContainerDied","Data":"416b265dbf921c68fdf43940cad2442d6b7007691a91852563ac296dc3fca2d2"} Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.025269 4553 scope.go:117] "RemoveContainer" containerID="68b818f388e0e062f069e429416bfbe3c90046953e131c760873bc30d0cd1e03" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.025450 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bm42p" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.039245 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nh4pv" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.039260 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh4pv" event={"ID":"2b3d8fa7-639e-46a6-8555-e5930dcc81c9","Type":"ContainerDied","Data":"bec84bd9f26f4f566390eea6d14cea765dac211933d462cb1428264542975584"} Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.039145 4553 generic.go:334] "Generic (PLEG): container finished" podID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" containerID="bec84bd9f26f4f566390eea6d14cea765dac211933d462cb1428264542975584" exitCode=0 Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.040377 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh4pv" event={"ID":"2b3d8fa7-639e-46a6-8555-e5930dcc81c9","Type":"ContainerDied","Data":"e0a2c5ac5f3abd93618510462a7d8026fb1b778ec88450ad88b5232ba4bac226"} Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.049089 4553 generic.go:334] "Generic (PLEG): container finished" podID="38703e61-9d3b-4d8d-aae8-9740c0948ceb" containerID="c2ed7190261109a50f912531cf92b7f50243bf6a782196046a1b6b1a989e1cdd" exitCode=0 Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.049177 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.049184 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" event={"ID":"38703e61-9d3b-4d8d-aae8-9740c0948ceb","Type":"ContainerDied","Data":"c2ed7190261109a50f912531cf92b7f50243bf6a782196046a1b6b1a989e1cdd"} Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.049217 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zvmr2" event={"ID":"38703e61-9d3b-4d8d-aae8-9740c0948ceb","Type":"ContainerDied","Data":"fd32379c8dd6ee77c35fd21304f2575a073102f0707cdf467ebb5f8fc1ff6ff6"} Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.060770 4553 scope.go:117] "RemoveContainer" containerID="3dcf7c720ee435731d3eea4582aed30b1ed66ef81e30388f85496ef86b54b16d" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.066465 4553 generic.go:334] "Generic (PLEG): container finished" podID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" containerID="dd877f3d26c2b976e29407ed18120d56a18a6fd3131cc18307e618f39624280f" exitCode=0 Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.066507 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9x8c" event={"ID":"f7d8b1f0-6dc8-4242-b2db-709ed240e30d","Type":"ContainerDied","Data":"dd877f3d26c2b976e29407ed18120d56a18a6fd3131cc18307e618f39624280f"} Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.066535 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r9x8c" event={"ID":"f7d8b1f0-6dc8-4242-b2db-709ed240e30d","Type":"ContainerDied","Data":"f3d259504a7ee92272338b5a4c8dcfbb8cb385fb1b3cea565f9d660f03c53e5d"} Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.066591 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r9x8c" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.075794 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3419189e-5bb6-44e2-a087-79f44da3bb41-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.076053 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxdb7\" (UniqueName: \"kubernetes.io/projected/3419189e-5bb6-44e2-a087-79f44da3bb41-kube-api-access-vxdb7\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.094125 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lgw6c"] Sep 30 19:36:48 crc kubenswrapper[4553]: W0930 19:36:48.106450 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc17292c5_31e1_4fd3_80cc_a635a1ee1348.slice/crio-e09d6c1f344551f40277af333c03b7e8f68f86298dcaa4224343260890917a15 WatchSource:0}: Error finding container e09d6c1f344551f40277af333c03b7e8f68f86298dcaa4224343260890917a15: Status 404 returned error can't find the container with id e09d6c1f344551f40277af333c03b7e8f68f86298dcaa4224343260890917a15 Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.112876 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nh4pv"] Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.116210 4553 scope.go:117] "RemoveContainer" containerID="3b24b4d163bb4777e4a05256c34cabdc2ecf2f1b6aa29440feda9ca226eae702" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.124495 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nh4pv"] Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.143660 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-r9x8c"] Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.147919 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r9x8c"] Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.175802 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zvmr2"] Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.183139 4553 scope.go:117] "RemoveContainer" containerID="68b818f388e0e062f069e429416bfbe3c90046953e131c760873bc30d0cd1e03" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.184622 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b818f388e0e062f069e429416bfbe3c90046953e131c760873bc30d0cd1e03\": container with ID starting with 68b818f388e0e062f069e429416bfbe3c90046953e131c760873bc30d0cd1e03 not found: ID does not exist" containerID="68b818f388e0e062f069e429416bfbe3c90046953e131c760873bc30d0cd1e03" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.184730 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b818f388e0e062f069e429416bfbe3c90046953e131c760873bc30d0cd1e03"} err="failed to get container status \"68b818f388e0e062f069e429416bfbe3c90046953e131c760873bc30d0cd1e03\": rpc error: code = NotFound desc = could not find container \"68b818f388e0e062f069e429416bfbe3c90046953e131c760873bc30d0cd1e03\": container with ID starting with 68b818f388e0e062f069e429416bfbe3c90046953e131c760873bc30d0cd1e03 not found: ID does not exist" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.184826 4553 scope.go:117] "RemoveContainer" containerID="3dcf7c720ee435731d3eea4582aed30b1ed66ef81e30388f85496ef86b54b16d" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.185473 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"3dcf7c720ee435731d3eea4582aed30b1ed66ef81e30388f85496ef86b54b16d\": container with ID starting with 3dcf7c720ee435731d3eea4582aed30b1ed66ef81e30388f85496ef86b54b16d not found: ID does not exist" containerID="3dcf7c720ee435731d3eea4582aed30b1ed66ef81e30388f85496ef86b54b16d" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.185562 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dcf7c720ee435731d3eea4582aed30b1ed66ef81e30388f85496ef86b54b16d"} err="failed to get container status \"3dcf7c720ee435731d3eea4582aed30b1ed66ef81e30388f85496ef86b54b16d\": rpc error: code = NotFound desc = could not find container \"3dcf7c720ee435731d3eea4582aed30b1ed66ef81e30388f85496ef86b54b16d\": container with ID starting with 3dcf7c720ee435731d3eea4582aed30b1ed66ef81e30388f85496ef86b54b16d not found: ID does not exist" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.185656 4553 scope.go:117] "RemoveContainer" containerID="3b24b4d163bb4777e4a05256c34cabdc2ecf2f1b6aa29440feda9ca226eae702" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.186304 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b24b4d163bb4777e4a05256c34cabdc2ecf2f1b6aa29440feda9ca226eae702\": container with ID starting with 3b24b4d163bb4777e4a05256c34cabdc2ecf2f1b6aa29440feda9ca226eae702 not found: ID does not exist" containerID="3b24b4d163bb4777e4a05256c34cabdc2ecf2f1b6aa29440feda9ca226eae702" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.186421 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b24b4d163bb4777e4a05256c34cabdc2ecf2f1b6aa29440feda9ca226eae702"} err="failed to get container status \"3b24b4d163bb4777e4a05256c34cabdc2ecf2f1b6aa29440feda9ca226eae702\": rpc error: code = NotFound desc = could not find container \"3b24b4d163bb4777e4a05256c34cabdc2ecf2f1b6aa29440feda9ca226eae702\": 
container with ID starting with 3b24b4d163bb4777e4a05256c34cabdc2ecf2f1b6aa29440feda9ca226eae702 not found: ID does not exist" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.186518 4553 scope.go:117] "RemoveContainer" containerID="bec84bd9f26f4f566390eea6d14cea765dac211933d462cb1428264542975584" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.196194 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3419189e-5bb6-44e2-a087-79f44da3bb41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3419189e-5bb6-44e2-a087-79f44da3bb41" (UID: "3419189e-5bb6-44e2-a087-79f44da3bb41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.199520 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zvmr2"] Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.248397 4553 scope.go:117] "RemoveContainer" containerID="17845fcf9a71a95d879331e9998e52ac2f16a47f0b56cfb7e56cee56d68f0f69" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.279448 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3419189e-5bb6-44e2-a087-79f44da3bb41-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.280984 4553 scope.go:117] "RemoveContainer" containerID="cb7351d1a11995720a3ee0efb39c61f1fa87dab316074ca3ed8b30132e31e4b1" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.309423 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.320691 4553 scope.go:117] "RemoveContainer" containerID="bec84bd9f26f4f566390eea6d14cea765dac211933d462cb1428264542975584" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.321126 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bec84bd9f26f4f566390eea6d14cea765dac211933d462cb1428264542975584\": container with ID starting with bec84bd9f26f4f566390eea6d14cea765dac211933d462cb1428264542975584 not found: ID does not exist" containerID="bec84bd9f26f4f566390eea6d14cea765dac211933d462cb1428264542975584" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.321155 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bec84bd9f26f4f566390eea6d14cea765dac211933d462cb1428264542975584"} err="failed to get container status \"bec84bd9f26f4f566390eea6d14cea765dac211933d462cb1428264542975584\": rpc error: code = NotFound desc = could not find container \"bec84bd9f26f4f566390eea6d14cea765dac211933d462cb1428264542975584\": container with ID starting with bec84bd9f26f4f566390eea6d14cea765dac211933d462cb1428264542975584 not found: ID does not exist" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.321176 4553 scope.go:117] "RemoveContainer" containerID="17845fcf9a71a95d879331e9998e52ac2f16a47f0b56cfb7e56cee56d68f0f69" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.322196 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17845fcf9a71a95d879331e9998e52ac2f16a47f0b56cfb7e56cee56d68f0f69\": container with ID starting with 17845fcf9a71a95d879331e9998e52ac2f16a47f0b56cfb7e56cee56d68f0f69 not found: ID does not exist" containerID="17845fcf9a71a95d879331e9998e52ac2f16a47f0b56cfb7e56cee56d68f0f69" Sep 30 19:36:48 crc 
kubenswrapper[4553]: I0930 19:36:48.322218 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17845fcf9a71a95d879331e9998e52ac2f16a47f0b56cfb7e56cee56d68f0f69"} err="failed to get container status \"17845fcf9a71a95d879331e9998e52ac2f16a47f0b56cfb7e56cee56d68f0f69\": rpc error: code = NotFound desc = could not find container \"17845fcf9a71a95d879331e9998e52ac2f16a47f0b56cfb7e56cee56d68f0f69\": container with ID starting with 17845fcf9a71a95d879331e9998e52ac2f16a47f0b56cfb7e56cee56d68f0f69 not found: ID does not exist" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.322232 4553 scope.go:117] "RemoveContainer" containerID="cb7351d1a11995720a3ee0efb39c61f1fa87dab316074ca3ed8b30132e31e4b1" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.327371 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb7351d1a11995720a3ee0efb39c61f1fa87dab316074ca3ed8b30132e31e4b1\": container with ID starting with cb7351d1a11995720a3ee0efb39c61f1fa87dab316074ca3ed8b30132e31e4b1 not found: ID does not exist" containerID="cb7351d1a11995720a3ee0efb39c61f1fa87dab316074ca3ed8b30132e31e4b1" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.327423 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7351d1a11995720a3ee0efb39c61f1fa87dab316074ca3ed8b30132e31e4b1"} err="failed to get container status \"cb7351d1a11995720a3ee0efb39c61f1fa87dab316074ca3ed8b30132e31e4b1\": rpc error: code = NotFound desc = could not find container \"cb7351d1a11995720a3ee0efb39c61f1fa87dab316074ca3ed8b30132e31e4b1\": container with ID starting with cb7351d1a11995720a3ee0efb39c61f1fa87dab316074ca3ed8b30132e31e4b1 not found: ID does not exist" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.327455 4553 scope.go:117] "RemoveContainer" containerID="c2ed7190261109a50f912531cf92b7f50243bf6a782196046a1b6b1a989e1cdd" Sep 30 
19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.392049 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bm42p"] Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.392112 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bm42p"] Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.397817 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hllp2\" (UniqueName: \"kubernetes.io/projected/48844b8a-7077-4916-8e11-d21992f206e0-kube-api-access-hllp2\") pod \"48844b8a-7077-4916-8e11-d21992f206e0\" (UID: \"48844b8a-7077-4916-8e11-d21992f206e0\") " Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.397864 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48844b8a-7077-4916-8e11-d21992f206e0-catalog-content\") pod \"48844b8a-7077-4916-8e11-d21992f206e0\" (UID: \"48844b8a-7077-4916-8e11-d21992f206e0\") " Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.397905 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48844b8a-7077-4916-8e11-d21992f206e0-utilities\") pod \"48844b8a-7077-4916-8e11-d21992f206e0\" (UID: \"48844b8a-7077-4916-8e11-d21992f206e0\") " Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.401701 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48844b8a-7077-4916-8e11-d21992f206e0-utilities" (OuterVolumeSpecName: "utilities") pod "48844b8a-7077-4916-8e11-d21992f206e0" (UID: "48844b8a-7077-4916-8e11-d21992f206e0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.410812 4553 scope.go:117] "RemoveContainer" containerID="c2ed7190261109a50f912531cf92b7f50243bf6a782196046a1b6b1a989e1cdd" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.412029 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48844b8a-7077-4916-8e11-d21992f206e0-kube-api-access-hllp2" (OuterVolumeSpecName: "kube-api-access-hllp2") pod "48844b8a-7077-4916-8e11-d21992f206e0" (UID: "48844b8a-7077-4916-8e11-d21992f206e0"). InnerVolumeSpecName "kube-api-access-hllp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.413362 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2ed7190261109a50f912531cf92b7f50243bf6a782196046a1b6b1a989e1cdd\": container with ID starting with c2ed7190261109a50f912531cf92b7f50243bf6a782196046a1b6b1a989e1cdd not found: ID does not exist" containerID="c2ed7190261109a50f912531cf92b7f50243bf6a782196046a1b6b1a989e1cdd" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.413394 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2ed7190261109a50f912531cf92b7f50243bf6a782196046a1b6b1a989e1cdd"} err="failed to get container status \"c2ed7190261109a50f912531cf92b7f50243bf6a782196046a1b6b1a989e1cdd\": rpc error: code = NotFound desc = could not find container \"c2ed7190261109a50f912531cf92b7f50243bf6a782196046a1b6b1a989e1cdd\": container with ID starting with c2ed7190261109a50f912531cf92b7f50243bf6a782196046a1b6b1a989e1cdd not found: ID does not exist" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.413416 4553 scope.go:117] "RemoveContainer" containerID="dd877f3d26c2b976e29407ed18120d56a18a6fd3131cc18307e618f39624280f" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.431444 
4553 scope.go:117] "RemoveContainer" containerID="91718ca6b5776394398a6b2a7314c5018f88c3e18d989f75f0b0aa82ab3a7edd" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.446465 4553 scope.go:117] "RemoveContainer" containerID="864cf4ea5dd9d7ade82f4e7da09dd03f2c1fe20a9292eb61bd00f4adbf3fa142" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.462868 4553 scope.go:117] "RemoveContainer" containerID="dd877f3d26c2b976e29407ed18120d56a18a6fd3131cc18307e618f39624280f" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.463225 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd877f3d26c2b976e29407ed18120d56a18a6fd3131cc18307e618f39624280f\": container with ID starting with dd877f3d26c2b976e29407ed18120d56a18a6fd3131cc18307e618f39624280f not found: ID does not exist" containerID="dd877f3d26c2b976e29407ed18120d56a18a6fd3131cc18307e618f39624280f" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.463257 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd877f3d26c2b976e29407ed18120d56a18a6fd3131cc18307e618f39624280f"} err="failed to get container status \"dd877f3d26c2b976e29407ed18120d56a18a6fd3131cc18307e618f39624280f\": rpc error: code = NotFound desc = could not find container \"dd877f3d26c2b976e29407ed18120d56a18a6fd3131cc18307e618f39624280f\": container with ID starting with dd877f3d26c2b976e29407ed18120d56a18a6fd3131cc18307e618f39624280f not found: ID does not exist" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.463280 4553 scope.go:117] "RemoveContainer" containerID="91718ca6b5776394398a6b2a7314c5018f88c3e18d989f75f0b0aa82ab3a7edd" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.463471 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91718ca6b5776394398a6b2a7314c5018f88c3e18d989f75f0b0aa82ab3a7edd\": container with ID starting 
with 91718ca6b5776394398a6b2a7314c5018f88c3e18d989f75f0b0aa82ab3a7edd not found: ID does not exist" containerID="91718ca6b5776394398a6b2a7314c5018f88c3e18d989f75f0b0aa82ab3a7edd" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.463491 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91718ca6b5776394398a6b2a7314c5018f88c3e18d989f75f0b0aa82ab3a7edd"} err="failed to get container status \"91718ca6b5776394398a6b2a7314c5018f88c3e18d989f75f0b0aa82ab3a7edd\": rpc error: code = NotFound desc = could not find container \"91718ca6b5776394398a6b2a7314c5018f88c3e18d989f75f0b0aa82ab3a7edd\": container with ID starting with 91718ca6b5776394398a6b2a7314c5018f88c3e18d989f75f0b0aa82ab3a7edd not found: ID does not exist" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.463504 4553 scope.go:117] "RemoveContainer" containerID="864cf4ea5dd9d7ade82f4e7da09dd03f2c1fe20a9292eb61bd00f4adbf3fa142" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.465226 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"864cf4ea5dd9d7ade82f4e7da09dd03f2c1fe20a9292eb61bd00f4adbf3fa142\": container with ID starting with 864cf4ea5dd9d7ade82f4e7da09dd03f2c1fe20a9292eb61bd00f4adbf3fa142 not found: ID does not exist" containerID="864cf4ea5dd9d7ade82f4e7da09dd03f2c1fe20a9292eb61bd00f4adbf3fa142" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.465265 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"864cf4ea5dd9d7ade82f4e7da09dd03f2c1fe20a9292eb61bd00f4adbf3fa142"} err="failed to get container status \"864cf4ea5dd9d7ade82f4e7da09dd03f2c1fe20a9292eb61bd00f4adbf3fa142\": rpc error: code = NotFound desc = could not find container \"864cf4ea5dd9d7ade82f4e7da09dd03f2c1fe20a9292eb61bd00f4adbf3fa142\": container with ID starting with 864cf4ea5dd9d7ade82f4e7da09dd03f2c1fe20a9292eb61bd00f4adbf3fa142 not found: ID does 
not exist" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.471719 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48844b8a-7077-4916-8e11-d21992f206e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48844b8a-7077-4916-8e11-d21992f206e0" (UID: "48844b8a-7077-4916-8e11-d21992f206e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.499974 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48844b8a-7077-4916-8e11-d21992f206e0-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.500008 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hllp2\" (UniqueName: \"kubernetes.io/projected/48844b8a-7077-4916-8e11-d21992f206e0-kube-api-access-hllp2\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.500018 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48844b8a-7077-4916-8e11-d21992f206e0-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993080 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q6qnb"] Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.993355 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" containerName="extract-content" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993377 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" containerName="extract-content" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.993398 4553 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="48844b8a-7077-4916-8e11-d21992f206e0" containerName="extract-content" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993408 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="48844b8a-7077-4916-8e11-d21992f206e0" containerName="extract-content" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.993423 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48844b8a-7077-4916-8e11-d21992f206e0" containerName="extract-utilities" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993435 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="48844b8a-7077-4916-8e11-d21992f206e0" containerName="extract-utilities" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.993452 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3419189e-5bb6-44e2-a087-79f44da3bb41" containerName="extract-content" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993464 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="3419189e-5bb6-44e2-a087-79f44da3bb41" containerName="extract-content" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.993487 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" containerName="registry-server" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993498 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" containerName="registry-server" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.993512 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" containerName="extract-content" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993522 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" containerName="extract-content" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.993535 4553 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3419189e-5bb6-44e2-a087-79f44da3bb41" containerName="extract-utilities" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993545 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="3419189e-5bb6-44e2-a087-79f44da3bb41" containerName="extract-utilities" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.993557 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3419189e-5bb6-44e2-a087-79f44da3bb41" containerName="registry-server" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993569 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="3419189e-5bb6-44e2-a087-79f44da3bb41" containerName="registry-server" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.993584 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" containerName="extract-utilities" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993594 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" containerName="extract-utilities" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.993608 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38703e61-9d3b-4d8d-aae8-9740c0948ceb" containerName="marketplace-operator" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993618 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="38703e61-9d3b-4d8d-aae8-9740c0948ceb" containerName="marketplace-operator" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.993631 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" containerName="extract-utilities" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993642 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" containerName="extract-utilities" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.993655 4553 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="48844b8a-7077-4916-8e11-d21992f206e0" containerName="registry-server" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993664 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="48844b8a-7077-4916-8e11-d21992f206e0" containerName="registry-server" Sep 30 19:36:48 crc kubenswrapper[4553]: E0930 19:36:48.993678 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" containerName="registry-server" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993688 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" containerName="registry-server" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993818 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="48844b8a-7077-4916-8e11-d21992f206e0" containerName="registry-server" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993833 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" containerName="registry-server" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993844 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" containerName="registry-server" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993860 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="38703e61-9d3b-4d8d-aae8-9740c0948ceb" containerName="marketplace-operator" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.993873 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="3419189e-5bb6-44e2-a087-79f44da3bb41" containerName="registry-server" Sep 30 19:36:48 crc kubenswrapper[4553]: I0930 19:36:48.996070 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.000160 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.010666 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q6qnb"] Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.080217 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ln7lj" event={"ID":"48844b8a-7077-4916-8e11-d21992f206e0","Type":"ContainerDied","Data":"1e61dc84ed2c70b17f0b5c17f417a512180ab8ea35c4748312ed04f1c5b60a8a"} Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.080477 4553 scope.go:117] "RemoveContainer" containerID="b36124323b0c9089e9a9967009dd866eff0837062a22fefe74726d84cebd7c14" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.080302 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ln7lj" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.086004 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" event={"ID":"c17292c5-31e1-4fd3-80cc-a635a1ee1348","Type":"ContainerStarted","Data":"98306483a6559fdf5d180e5567c07932dcbcc33dcc41fe408a84e47594f8e30d"} Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.086111 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" event={"ID":"c17292c5-31e1-4fd3-80cc-a635a1ee1348","Type":"ContainerStarted","Data":"e09d6c1f344551f40277af333c03b7e8f68f86298dcaa4224343260890917a15"} Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.086311 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.095323 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.102499 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lgw6c" podStartSLOduration=2.10248461 podStartE2EDuration="2.10248461s" podCreationTimestamp="2025-09-30 19:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:36:49.101416181 +0000 UTC m=+262.300918311" watchObservedRunningTime="2025-09-30 19:36:49.10248461 +0000 UTC m=+262.301986740" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.103330 4553 scope.go:117] "RemoveContainer" containerID="bc47cd56ea07b3578d83c6438cc2a8c3587c6bb3c1b37f9f7d496a7ece27f694" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.107776 4553 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/044b8190-3a71-4b25-a654-8087bbacd1fd-catalog-content\") pod \"community-operators-q6qnb\" (UID: \"044b8190-3a71-4b25-a654-8087bbacd1fd\") " pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.108118 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48qsk\" (UniqueName: \"kubernetes.io/projected/044b8190-3a71-4b25-a654-8087bbacd1fd-kube-api-access-48qsk\") pod \"community-operators-q6qnb\" (UID: \"044b8190-3a71-4b25-a654-8087bbacd1fd\") " pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.108302 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/044b8190-3a71-4b25-a654-8087bbacd1fd-utilities\") pod \"community-operators-q6qnb\" (UID: \"044b8190-3a71-4b25-a654-8087bbacd1fd\") " pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.126996 4553 scope.go:117] "RemoveContainer" containerID="cf463790b1a7cc2f25ec9f48ad3261f42412c4201931bfe2df819d9378308533" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.147943 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ln7lj"] Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.153209 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ln7lj"] Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.210075 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48qsk\" (UniqueName: \"kubernetes.io/projected/044b8190-3a71-4b25-a654-8087bbacd1fd-kube-api-access-48qsk\") pod 
\"community-operators-q6qnb\" (UID: \"044b8190-3a71-4b25-a654-8087bbacd1fd\") " pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.210132 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/044b8190-3a71-4b25-a654-8087bbacd1fd-utilities\") pod \"community-operators-q6qnb\" (UID: \"044b8190-3a71-4b25-a654-8087bbacd1fd\") " pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.210162 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/044b8190-3a71-4b25-a654-8087bbacd1fd-catalog-content\") pod \"community-operators-q6qnb\" (UID: \"044b8190-3a71-4b25-a654-8087bbacd1fd\") " pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.210812 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/044b8190-3a71-4b25-a654-8087bbacd1fd-catalog-content\") pod \"community-operators-q6qnb\" (UID: \"044b8190-3a71-4b25-a654-8087bbacd1fd\") " pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.211086 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/044b8190-3a71-4b25-a654-8087bbacd1fd-utilities\") pod \"community-operators-q6qnb\" (UID: \"044b8190-3a71-4b25-a654-8087bbacd1fd\") " pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.227268 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48qsk\" (UniqueName: \"kubernetes.io/projected/044b8190-3a71-4b25-a654-8087bbacd1fd-kube-api-access-48qsk\") pod \"community-operators-q6qnb\" (UID: 
\"044b8190-3a71-4b25-a654-8087bbacd1fd\") " pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.315550 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.510594 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b3d8fa7-639e-46a6-8555-e5930dcc81c9" path="/var/lib/kubelet/pods/2b3d8fa7-639e-46a6-8555-e5930dcc81c9/volumes" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.512270 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3419189e-5bb6-44e2-a087-79f44da3bb41" path="/var/lib/kubelet/pods/3419189e-5bb6-44e2-a087-79f44da3bb41/volumes" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.513360 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38703e61-9d3b-4d8d-aae8-9740c0948ceb" path="/var/lib/kubelet/pods/38703e61-9d3b-4d8d-aae8-9740c0948ceb/volumes" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.514512 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48844b8a-7077-4916-8e11-d21992f206e0" path="/var/lib/kubelet/pods/48844b8a-7077-4916-8e11-d21992f206e0/volumes" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.515443 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7d8b1f0-6dc8-4242-b2db-709ed240e30d" path="/var/lib/kubelet/pods/f7d8b1f0-6dc8-4242-b2db-709ed240e30d/volumes" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.589760 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rsgr5"] Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.590875 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.592553 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.598274 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rsgr5"] Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.691260 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q6qnb"] Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.717116 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621efb6a-40a5-416f-a473-4bf9e8837b76-utilities\") pod \"certified-operators-rsgr5\" (UID: \"621efb6a-40a5-416f-a473-4bf9e8837b76\") " pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.717422 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5msdk\" (UniqueName: \"kubernetes.io/projected/621efb6a-40a5-416f-a473-4bf9e8837b76-kube-api-access-5msdk\") pod \"certified-operators-rsgr5\" (UID: \"621efb6a-40a5-416f-a473-4bf9e8837b76\") " pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.717469 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621efb6a-40a5-416f-a473-4bf9e8837b76-catalog-content\") pod \"certified-operators-rsgr5\" (UID: \"621efb6a-40a5-416f-a473-4bf9e8837b76\") " pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.818940 4553 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621efb6a-40a5-416f-a473-4bf9e8837b76-utilities\") pod \"certified-operators-rsgr5\" (UID: \"621efb6a-40a5-416f-a473-4bf9e8837b76\") " pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.818993 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5msdk\" (UniqueName: \"kubernetes.io/projected/621efb6a-40a5-416f-a473-4bf9e8837b76-kube-api-access-5msdk\") pod \"certified-operators-rsgr5\" (UID: \"621efb6a-40a5-416f-a473-4bf9e8837b76\") " pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.819023 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621efb6a-40a5-416f-a473-4bf9e8837b76-catalog-content\") pod \"certified-operators-rsgr5\" (UID: \"621efb6a-40a5-416f-a473-4bf9e8837b76\") " pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.819507 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/621efb6a-40a5-416f-a473-4bf9e8837b76-catalog-content\") pod \"certified-operators-rsgr5\" (UID: \"621efb6a-40a5-416f-a473-4bf9e8837b76\") " pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.819608 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/621efb6a-40a5-416f-a473-4bf9e8837b76-utilities\") pod \"certified-operators-rsgr5\" (UID: \"621efb6a-40a5-416f-a473-4bf9e8837b76\") " pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.847287 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5msdk\" (UniqueName: 
\"kubernetes.io/projected/621efb6a-40a5-416f-a473-4bf9e8837b76-kube-api-access-5msdk\") pod \"certified-operators-rsgr5\" (UID: \"621efb6a-40a5-416f-a473-4bf9e8837b76\") " pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:36:49 crc kubenswrapper[4553]: I0930 19:36:49.906874 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:36:50 crc kubenswrapper[4553]: I0930 19:36:50.092556 4553 generic.go:334] "Generic (PLEG): container finished" podID="044b8190-3a71-4b25-a654-8087bbacd1fd" containerID="d619b780efd5de4d3619bd0ff90090bb43d23beca5e8d402b4c1be2c42359c22" exitCode=0 Sep 30 19:36:50 crc kubenswrapper[4553]: I0930 19:36:50.094199 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6qnb" event={"ID":"044b8190-3a71-4b25-a654-8087bbacd1fd","Type":"ContainerDied","Data":"d619b780efd5de4d3619bd0ff90090bb43d23beca5e8d402b4c1be2c42359c22"} Sep 30 19:36:50 crc kubenswrapper[4553]: I0930 19:36:50.094227 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6qnb" event={"ID":"044b8190-3a71-4b25-a654-8087bbacd1fd","Type":"ContainerStarted","Data":"4f8f03f8de0ef9c576df52f3a0938d61973964fcb31aebce5d90b403fbe9ab2c"} Sep 30 19:36:50 crc kubenswrapper[4553]: I0930 19:36:50.309694 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rsgr5"] Sep 30 19:36:50 crc kubenswrapper[4553]: W0930 19:36:50.318689 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod621efb6a_40a5_416f_a473_4bf9e8837b76.slice/crio-f98358b016fdbe86101ed9babcad17370c0746f2ab96867d854751ac7dd6461a WatchSource:0}: Error finding container f98358b016fdbe86101ed9babcad17370c0746f2ab96867d854751ac7dd6461a: Status 404 returned error can't find the container with id 
f98358b016fdbe86101ed9babcad17370c0746f2ab96867d854751ac7dd6461a Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.099328 4553 generic.go:334] "Generic (PLEG): container finished" podID="621efb6a-40a5-416f-a473-4bf9e8837b76" containerID="9cd1c54270b3a7e241628adae8429059e4288efcf6a3eae56d2a7b034c66381f" exitCode=0 Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.099404 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsgr5" event={"ID":"621efb6a-40a5-416f-a473-4bf9e8837b76","Type":"ContainerDied","Data":"9cd1c54270b3a7e241628adae8429059e4288efcf6a3eae56d2a7b034c66381f"} Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.099800 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsgr5" event={"ID":"621efb6a-40a5-416f-a473-4bf9e8837b76","Type":"ContainerStarted","Data":"f98358b016fdbe86101ed9babcad17370c0746f2ab96867d854751ac7dd6461a"} Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.101473 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6qnb" event={"ID":"044b8190-3a71-4b25-a654-8087bbacd1fd","Type":"ContainerStarted","Data":"0c84257530b37b057125ebec2b73a2c88cf80eb9ecaf5edc901ae98b71182922"} Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.389516 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-58c2w"] Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.390541 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:36:51 crc kubenswrapper[4553]: W0930 19:36:51.403930 4553 reflector.go:561] object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb": failed to list *v1.Secret: secrets "redhat-marketplace-dockercfg-x2ctb" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Sep 30 19:36:51 crc kubenswrapper[4553]: E0930 19:36:51.403996 4553 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-x2ctb\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"redhat-marketplace-dockercfg-x2ctb\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.419591 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-58c2w"] Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.443733 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpnl4\" (UniqueName: \"kubernetes.io/projected/85b5e9a0-50cb-48f9-beb9-ecd2b1995370-kube-api-access-qpnl4\") pod \"redhat-marketplace-58c2w\" (UID: \"85b5e9a0-50cb-48f9-beb9-ecd2b1995370\") " pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.443815 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85b5e9a0-50cb-48f9-beb9-ecd2b1995370-catalog-content\") pod \"redhat-marketplace-58c2w\" (UID: \"85b5e9a0-50cb-48f9-beb9-ecd2b1995370\") " pod="openshift-marketplace/redhat-marketplace-58c2w" 
Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.443860 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85b5e9a0-50cb-48f9-beb9-ecd2b1995370-utilities\") pod \"redhat-marketplace-58c2w\" (UID: \"85b5e9a0-50cb-48f9-beb9-ecd2b1995370\") " pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.545352 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85b5e9a0-50cb-48f9-beb9-ecd2b1995370-catalog-content\") pod \"redhat-marketplace-58c2w\" (UID: \"85b5e9a0-50cb-48f9-beb9-ecd2b1995370\") " pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.545416 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85b5e9a0-50cb-48f9-beb9-ecd2b1995370-utilities\") pod \"redhat-marketplace-58c2w\" (UID: \"85b5e9a0-50cb-48f9-beb9-ecd2b1995370\") " pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.545476 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpnl4\" (UniqueName: \"kubernetes.io/projected/85b5e9a0-50cb-48f9-beb9-ecd2b1995370-kube-api-access-qpnl4\") pod \"redhat-marketplace-58c2w\" (UID: \"85b5e9a0-50cb-48f9-beb9-ecd2b1995370\") " pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.545964 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85b5e9a0-50cb-48f9-beb9-ecd2b1995370-utilities\") pod \"redhat-marketplace-58c2w\" (UID: \"85b5e9a0-50cb-48f9-beb9-ecd2b1995370\") " pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:36:51 crc 
kubenswrapper[4553]: I0930 19:36:51.545965 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85b5e9a0-50cb-48f9-beb9-ecd2b1995370-catalog-content\") pod \"redhat-marketplace-58c2w\" (UID: \"85b5e9a0-50cb-48f9-beb9-ecd2b1995370\") " pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.562925 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpnl4\" (UniqueName: \"kubernetes.io/projected/85b5e9a0-50cb-48f9-beb9-ecd2b1995370-kube-api-access-qpnl4\") pod \"redhat-marketplace-58c2w\" (UID: \"85b5e9a0-50cb-48f9-beb9-ecd2b1995370\") " pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:36:51 crc kubenswrapper[4553]: I0930 19:36:51.998049 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kpccw"] Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.001078 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.003008 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpccw"] Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.003314 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.050558 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93b0562f-066d-4491-a3a8-5b3d36463f49-utilities\") pod \"redhat-operators-kpccw\" (UID: \"93b0562f-066d-4491-a3a8-5b3d36463f49\") " pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.050594 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pscm4\" (UniqueName: \"kubernetes.io/projected/93b0562f-066d-4491-a3a8-5b3d36463f49-kube-api-access-pscm4\") pod \"redhat-operators-kpccw\" (UID: \"93b0562f-066d-4491-a3a8-5b3d36463f49\") " pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.050659 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93b0562f-066d-4491-a3a8-5b3d36463f49-catalog-content\") pod \"redhat-operators-kpccw\" (UID: \"93b0562f-066d-4491-a3a8-5b3d36463f49\") " pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.107811 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsgr5" event={"ID":"621efb6a-40a5-416f-a473-4bf9e8837b76","Type":"ContainerStarted","Data":"69fbf48707fe3fcf751cb13a4c16f31ccec7f82bf4cbcd332fe11c4ec3e4c0c1"} Sep 30 
19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.110168 4553 generic.go:334] "Generic (PLEG): container finished" podID="044b8190-3a71-4b25-a654-8087bbacd1fd" containerID="0c84257530b37b057125ebec2b73a2c88cf80eb9ecaf5edc901ae98b71182922" exitCode=0 Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.110223 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q6qnb" event={"ID":"044b8190-3a71-4b25-a654-8087bbacd1fd","Type":"ContainerDied","Data":"0c84257530b37b057125ebec2b73a2c88cf80eb9ecaf5edc901ae98b71182922"} Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.151489 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93b0562f-066d-4491-a3a8-5b3d36463f49-utilities\") pod \"redhat-operators-kpccw\" (UID: \"93b0562f-066d-4491-a3a8-5b3d36463f49\") " pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.151848 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pscm4\" (UniqueName: \"kubernetes.io/projected/93b0562f-066d-4491-a3a8-5b3d36463f49-kube-api-access-pscm4\") pod \"redhat-operators-kpccw\" (UID: \"93b0562f-066d-4491-a3a8-5b3d36463f49\") " pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.152075 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93b0562f-066d-4491-a3a8-5b3d36463f49-catalog-content\") pod \"redhat-operators-kpccw\" (UID: \"93b0562f-066d-4491-a3a8-5b3d36463f49\") " pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.152382 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93b0562f-066d-4491-a3a8-5b3d36463f49-catalog-content\") pod 
\"redhat-operators-kpccw\" (UID: \"93b0562f-066d-4491-a3a8-5b3d36463f49\") " pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.152497 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93b0562f-066d-4491-a3a8-5b3d36463f49-utilities\") pod \"redhat-operators-kpccw\" (UID: \"93b0562f-066d-4491-a3a8-5b3d36463f49\") " pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.167781 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pscm4\" (UniqueName: \"kubernetes.io/projected/93b0562f-066d-4491-a3a8-5b3d36463f49-kube-api-access-pscm4\") pod \"redhat-operators-kpccw\" (UID: \"93b0562f-066d-4491-a3a8-5b3d36463f49\") " pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.322115 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.712672 4553 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-marketplace/redhat-marketplace-58c2w" secret="" err="failed to sync secret cache: timed out waiting for the condition" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.713615 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.733335 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpccw"] Sep 30 19:36:52 crc kubenswrapper[4553]: W0930 19:36:52.748358 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93b0562f_066d_4491_a3a8_5b3d36463f49.slice/crio-e7c050d003bf122e5c6c4c7db9c35c2a7bd31108a57ba72d5ebb5c88c9d000c3 WatchSource:0}: Error finding container e7c050d003bf122e5c6c4c7db9c35c2a7bd31108a57ba72d5ebb5c88c9d000c3: Status 404 returned error can't find the container with id e7c050d003bf122e5c6c4c7db9c35c2a7bd31108a57ba72d5ebb5c88c9d000c3 Sep 30 19:36:52 crc kubenswrapper[4553]: I0930 19:36:52.780910 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 19:36:53 crc kubenswrapper[4553]: I0930 19:36:53.118466 4553 generic.go:334] "Generic (PLEG): container finished" podID="93b0562f-066d-4491-a3a8-5b3d36463f49" containerID="c34afbfe5dcf729160c7d7242d928b5b1097b3988501de19a2c5c5d6e2a898e7" exitCode=0 Sep 30 19:36:53 crc kubenswrapper[4553]: I0930 19:36:53.118898 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpccw" event={"ID":"93b0562f-066d-4491-a3a8-5b3d36463f49","Type":"ContainerDied","Data":"c34afbfe5dcf729160c7d7242d928b5b1097b3988501de19a2c5c5d6e2a898e7"} Sep 30 19:36:53 crc kubenswrapper[4553]: I0930 19:36:53.118925 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpccw" event={"ID":"93b0562f-066d-4491-a3a8-5b3d36463f49","Type":"ContainerStarted","Data":"e7c050d003bf122e5c6c4c7db9c35c2a7bd31108a57ba72d5ebb5c88c9d000c3"} Sep 30 19:36:53 crc kubenswrapper[4553]: I0930 19:36:53.127271 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-q6qnb" event={"ID":"044b8190-3a71-4b25-a654-8087bbacd1fd","Type":"ContainerStarted","Data":"ed25df091674c8c1a3c0bf40cae172ec43358dd6fcaef4fd1b8a58985302a60c"} Sep 30 19:36:53 crc kubenswrapper[4553]: I0930 19:36:53.131465 4553 generic.go:334] "Generic (PLEG): container finished" podID="621efb6a-40a5-416f-a473-4bf9e8837b76" containerID="69fbf48707fe3fcf751cb13a4c16f31ccec7f82bf4cbcd332fe11c4ec3e4c0c1" exitCode=0 Sep 30 19:36:53 crc kubenswrapper[4553]: I0930 19:36:53.131494 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsgr5" event={"ID":"621efb6a-40a5-416f-a473-4bf9e8837b76","Type":"ContainerDied","Data":"69fbf48707fe3fcf751cb13a4c16f31ccec7f82bf4cbcd332fe11c4ec3e4c0c1"} Sep 30 19:36:53 crc kubenswrapper[4553]: I0930 19:36:53.162146 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-58c2w"] Sep 30 19:36:53 crc kubenswrapper[4553]: I0930 19:36:53.178106 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q6qnb" podStartSLOduration=2.74968177 podStartE2EDuration="5.178082893s" podCreationTimestamp="2025-09-30 19:36:48 +0000 UTC" firstStartedPulling="2025-09-30 19:36:50.097522257 +0000 UTC m=+263.297024387" lastFinishedPulling="2025-09-30 19:36:52.52592338 +0000 UTC m=+265.725425510" observedRunningTime="2025-09-30 19:36:53.174395104 +0000 UTC m=+266.373897244" watchObservedRunningTime="2025-09-30 19:36:53.178082893 +0000 UTC m=+266.377585023" Sep 30 19:36:54 crc kubenswrapper[4553]: I0930 19:36:54.138965 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsgr5" event={"ID":"621efb6a-40a5-416f-a473-4bf9e8837b76","Type":"ContainerStarted","Data":"157e607021c60e05ec7ae160fcc7466e20670fdc7b658785dd4f9bdc14baada6"} Sep 30 19:36:54 crc kubenswrapper[4553]: I0930 19:36:54.141366 4553 generic.go:334] 
"Generic (PLEG): container finished" podID="85b5e9a0-50cb-48f9-beb9-ecd2b1995370" containerID="906e150607b476f48b28d78c58c6f01dda258425cf8ba8ffb46232f2978c2096" exitCode=0 Sep 30 19:36:54 crc kubenswrapper[4553]: I0930 19:36:54.142517 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58c2w" event={"ID":"85b5e9a0-50cb-48f9-beb9-ecd2b1995370","Type":"ContainerDied","Data":"906e150607b476f48b28d78c58c6f01dda258425cf8ba8ffb46232f2978c2096"} Sep 30 19:36:54 crc kubenswrapper[4553]: I0930 19:36:54.142556 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58c2w" event={"ID":"85b5e9a0-50cb-48f9-beb9-ecd2b1995370","Type":"ContainerStarted","Data":"a29483f5d9dfc23db8b84a44ff3d3b2ec70f87e7efa978ebc836ad730609aef7"} Sep 30 19:36:54 crc kubenswrapper[4553]: I0930 19:36:54.158163 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rsgr5" podStartSLOduration=2.454217692 podStartE2EDuration="5.158146618s" podCreationTimestamp="2025-09-30 19:36:49 +0000 UTC" firstStartedPulling="2025-09-30 19:36:51.100728894 +0000 UTC m=+264.300231024" lastFinishedPulling="2025-09-30 19:36:53.80465782 +0000 UTC m=+267.004159950" observedRunningTime="2025-09-30 19:36:54.156291538 +0000 UTC m=+267.355793668" watchObservedRunningTime="2025-09-30 19:36:54.158146618 +0000 UTC m=+267.357648748" Sep 30 19:36:55 crc kubenswrapper[4553]: I0930 19:36:55.149396 4553 generic.go:334] "Generic (PLEG): container finished" podID="85b5e9a0-50cb-48f9-beb9-ecd2b1995370" containerID="f9a5c47bdaba3204fe5e5bfe9cc935f57c32d9957843f3e9d9008be5f430ef8b" exitCode=0 Sep 30 19:36:55 crc kubenswrapper[4553]: I0930 19:36:55.149688 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58c2w" 
event={"ID":"85b5e9a0-50cb-48f9-beb9-ecd2b1995370","Type":"ContainerDied","Data":"f9a5c47bdaba3204fe5e5bfe9cc935f57c32d9957843f3e9d9008be5f430ef8b"} Sep 30 19:36:55 crc kubenswrapper[4553]: I0930 19:36:55.152587 4553 generic.go:334] "Generic (PLEG): container finished" podID="93b0562f-066d-4491-a3a8-5b3d36463f49" containerID="02bcf1057e65a242433c05d52d5e5b5680c2e9c458772aaca8f93e1509ace3d2" exitCode=0 Sep 30 19:36:55 crc kubenswrapper[4553]: I0930 19:36:55.153665 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpccw" event={"ID":"93b0562f-066d-4491-a3a8-5b3d36463f49","Type":"ContainerDied","Data":"02bcf1057e65a242433c05d52d5e5b5680c2e9c458772aaca8f93e1509ace3d2"} Sep 30 19:36:57 crc kubenswrapper[4553]: I0930 19:36:57.181335 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpccw" event={"ID":"93b0562f-066d-4491-a3a8-5b3d36463f49","Type":"ContainerStarted","Data":"d00c050307dfebc8c589dba80dadff7535675166950629beccf49c4e95b41a82"} Sep 30 19:36:57 crc kubenswrapper[4553]: I0930 19:36:57.184979 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58c2w" event={"ID":"85b5e9a0-50cb-48f9-beb9-ecd2b1995370","Type":"ContainerStarted","Data":"56714c5c8571f6526c33b34175b2d9aed45fe163ebc6d3a5b24969d5b7c5675e"} Sep 30 19:36:57 crc kubenswrapper[4553]: I0930 19:36:57.210948 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kpccw" podStartSLOduration=3.606157156 podStartE2EDuration="6.210915778s" podCreationTimestamp="2025-09-30 19:36:51 +0000 UTC" firstStartedPulling="2025-09-30 19:36:53.123381803 +0000 UTC m=+266.322883933" lastFinishedPulling="2025-09-30 19:36:55.728140385 +0000 UTC m=+268.927642555" observedRunningTime="2025-09-30 19:36:57.206865099 +0000 UTC m=+270.406367229" watchObservedRunningTime="2025-09-30 19:36:57.210915778 +0000 UTC m=+270.410417908" Sep 
30 19:36:57 crc kubenswrapper[4553]: I0930 19:36:57.259240 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-58c2w" podStartSLOduration=4.545902667 podStartE2EDuration="6.259220045s" podCreationTimestamp="2025-09-30 19:36:51 +0000 UTC" firstStartedPulling="2025-09-30 19:36:54.143889075 +0000 UTC m=+267.343391205" lastFinishedPulling="2025-09-30 19:36:55.857206453 +0000 UTC m=+269.056708583" observedRunningTime="2025-09-30 19:36:57.255812224 +0000 UTC m=+270.455314374" watchObservedRunningTime="2025-09-30 19:36:57.259220045 +0000 UTC m=+270.458722175" Sep 30 19:36:59 crc kubenswrapper[4553]: I0930 19:36:59.316275 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:59 crc kubenswrapper[4553]: I0930 19:36:59.316712 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:59 crc kubenswrapper[4553]: I0930 19:36:59.381557 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:36:59 crc kubenswrapper[4553]: I0930 19:36:59.907841 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:36:59 crc kubenswrapper[4553]: I0930 19:36:59.907936 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:36:59 crc kubenswrapper[4553]: I0930 19:36:59.956236 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:37:00 crc kubenswrapper[4553]: I0930 19:37:00.247505 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q6qnb" Sep 30 19:37:00 crc kubenswrapper[4553]: 
I0930 19:37:00.264234 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rsgr5" Sep 30 19:37:02 crc kubenswrapper[4553]: I0930 19:37:02.322930 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:37:02 crc kubenswrapper[4553]: I0930 19:37:02.323773 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:37:02 crc kubenswrapper[4553]: I0930 19:37:02.375302 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:37:02 crc kubenswrapper[4553]: I0930 19:37:02.715122 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:37:02 crc kubenswrapper[4553]: I0930 19:37:02.715741 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:37:02 crc kubenswrapper[4553]: I0930 19:37:02.764189 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:37:03 crc kubenswrapper[4553]: I0930 19:37:03.268871 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kpccw" Sep 30 19:37:03 crc kubenswrapper[4553]: I0930 19:37:03.268944 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-58c2w" Sep 30 19:38:29 crc kubenswrapper[4553]: I0930 19:38:29.585032 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Sep 30 19:38:29 crc kubenswrapper[4553]: I0930 19:38:29.585589 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:38:59 crc kubenswrapper[4553]: I0930 19:38:59.586218 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:38:59 crc kubenswrapper[4553]: I0930 19:38:59.587413 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.204692 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cz969"] Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.206161 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.223123 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cz969"] Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.398397 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97e2f167-9606-487c-b373-79a53ca9eefc-trusted-ca\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.398728 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97e2f167-9606-487c-b373-79a53ca9eefc-bound-sa-token\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.398930 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97e2f167-9606-487c-b373-79a53ca9eefc-registry-certificates\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.399127 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97e2f167-9606-487c-b373-79a53ca9eefc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.399309 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.399467 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97e2f167-9606-487c-b373-79a53ca9eefc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.399608 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqzgs\" (UniqueName: \"kubernetes.io/projected/97e2f167-9606-487c-b373-79a53ca9eefc-kube-api-access-mqzgs\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.399783 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97e2f167-9606-487c-b373-79a53ca9eefc-registry-tls\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.426175 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.501093 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97e2f167-9606-487c-b373-79a53ca9eefc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.501179 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqzgs\" (UniqueName: \"kubernetes.io/projected/97e2f167-9606-487c-b373-79a53ca9eefc-kube-api-access-mqzgs\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.501208 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97e2f167-9606-487c-b373-79a53ca9eefc-registry-tls\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.501236 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97e2f167-9606-487c-b373-79a53ca9eefc-trusted-ca\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: 
I0930 19:39:15.501281 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97e2f167-9606-487c-b373-79a53ca9eefc-bound-sa-token\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.501317 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97e2f167-9606-487c-b373-79a53ca9eefc-registry-certificates\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.501360 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97e2f167-9606-487c-b373-79a53ca9eefc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.502389 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97e2f167-9606-487c-b373-79a53ca9eefc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.503249 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97e2f167-9606-487c-b373-79a53ca9eefc-registry-certificates\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.504299 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97e2f167-9606-487c-b373-79a53ca9eefc-trusted-ca\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.509249 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97e2f167-9606-487c-b373-79a53ca9eefc-registry-tls\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.509822 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97e2f167-9606-487c-b373-79a53ca9eefc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.525410 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97e2f167-9606-487c-b373-79a53ca9eefc-bound-sa-token\") pod \"image-registry-66df7c8f76-cz969\" (UID: \"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.527740 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqzgs\" (UniqueName: \"kubernetes.io/projected/97e2f167-9606-487c-b373-79a53ca9eefc-kube-api-access-mqzgs\") pod \"image-registry-66df7c8f76-cz969\" (UID: 
\"97e2f167-9606-487c-b373-79a53ca9eefc\") " pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:15 crc kubenswrapper[4553]: I0930 19:39:15.820824 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:16 crc kubenswrapper[4553]: I0930 19:39:16.180128 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cz969"] Sep 30 19:39:17 crc kubenswrapper[4553]: I0930 19:39:17.137230 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cz969" event={"ID":"97e2f167-9606-487c-b373-79a53ca9eefc","Type":"ContainerStarted","Data":"75bef10aef9fd0ceea77315f0a0c0762972bef7ee974bbecf5f93dfbdab33b6c"} Sep 30 19:39:17 crc kubenswrapper[4553]: I0930 19:39:17.137848 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:17 crc kubenswrapper[4553]: I0930 19:39:17.137869 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cz969" event={"ID":"97e2f167-9606-487c-b373-79a53ca9eefc","Type":"ContainerStarted","Data":"80c048310b496e442a7140bddd916c1dbfb6fa5e3ba4962e7d914eb36a449687"} Sep 30 19:39:17 crc kubenswrapper[4553]: I0930 19:39:17.177436 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-cz969" podStartSLOduration=2.177397461 podStartE2EDuration="2.177397461s" podCreationTimestamp="2025-09-30 19:39:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:39:17.169504839 +0000 UTC m=+410.369007039" watchObservedRunningTime="2025-09-30 19:39:17.177397461 +0000 UTC m=+410.376899631" Sep 30 19:39:29 crc kubenswrapper[4553]: I0930 
19:39:29.584964 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:39:29 crc kubenswrapper[4553]: I0930 19:39:29.585668 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:39:29 crc kubenswrapper[4553]: I0930 19:39:29.585733 4553 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:39:29 crc kubenswrapper[4553]: I0930 19:39:29.586515 4553 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7c114586c54354df4e3892b93d193976a14755ff2513086bcc2ebc83fbe5f06f"} pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:39:29 crc kubenswrapper[4553]: I0930 19:39:29.586611 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" containerID="cri-o://7c114586c54354df4e3892b93d193976a14755ff2513086bcc2ebc83fbe5f06f" gracePeriod=600 Sep 30 19:39:30 crc kubenswrapper[4553]: I0930 19:39:30.231295 4553 generic.go:334] "Generic (PLEG): container finished" podID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerID="7c114586c54354df4e3892b93d193976a14755ff2513086bcc2ebc83fbe5f06f" exitCode=0 Sep 30 
19:39:30 crc kubenswrapper[4553]: I0930 19:39:30.231342 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerDied","Data":"7c114586c54354df4e3892b93d193976a14755ff2513086bcc2ebc83fbe5f06f"} Sep 30 19:39:30 crc kubenswrapper[4553]: I0930 19:39:30.231939 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerStarted","Data":"51154b57f12370c60080e989e52d35722515976fa625d655a2c4cbbb683003ca"} Sep 30 19:39:30 crc kubenswrapper[4553]: I0930 19:39:30.232074 4553 scope.go:117] "RemoveContainer" containerID="dbe5c1597b16b63da7edea2c1cab22a34959255ab5ac6f078ab5b41f349e0f0d" Sep 30 19:39:35 crc kubenswrapper[4553]: I0930 19:39:35.833606 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-cz969" Sep 30 19:39:35 crc kubenswrapper[4553]: I0930 19:39:35.940925 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6bpj6"] Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.012795 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" podUID="50e7e6b4-78bd-4209-bf3e-7c27662763fd" containerName="registry" containerID="cri-o://4ac1943685a1a234e02b0cf8b3786862c5328164394d3ae8ff52cb447a9dd616" gracePeriod=30 Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.395031 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.463682 4553 generic.go:334] "Generic (PLEG): container finished" podID="50e7e6b4-78bd-4209-bf3e-7c27662763fd" containerID="4ac1943685a1a234e02b0cf8b3786862c5328164394d3ae8ff52cb447a9dd616" exitCode=0 Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.463755 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.463765 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" event={"ID":"50e7e6b4-78bd-4209-bf3e-7c27662763fd","Type":"ContainerDied","Data":"4ac1943685a1a234e02b0cf8b3786862c5328164394d3ae8ff52cb447a9dd616"} Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.463833 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6bpj6" event={"ID":"50e7e6b4-78bd-4209-bf3e-7c27662763fd","Type":"ContainerDied","Data":"f15e7afc340fac1dc0a1b01657a4951ece1d7588a18a7bcef2839b4a0ef3e5b1"} Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.463853 4553 scope.go:117] "RemoveContainer" containerID="4ac1943685a1a234e02b0cf8b3786862c5328164394d3ae8ff52cb447a9dd616" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.470059 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfks5\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-kube-api-access-jfks5\") pod \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.470102 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/50e7e6b4-78bd-4209-bf3e-7c27662763fd-ca-trust-extracted\") pod \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.470148 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-registry-tls\") pod \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.470199 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50e7e6b4-78bd-4209-bf3e-7c27662763fd-registry-certificates\") pod \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.470226 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-bound-sa-token\") pod \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.470406 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.470458 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50e7e6b4-78bd-4209-bf3e-7c27662763fd-trusted-ca\") pod \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " Sep 30 19:40:01 crc 
kubenswrapper[4553]: I0930 19:40:01.471265 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50e7e6b4-78bd-4209-bf3e-7c27662763fd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "50e7e6b4-78bd-4209-bf3e-7c27662763fd" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.471720 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50e7e6b4-78bd-4209-bf3e-7c27662763fd-installation-pull-secrets\") pod \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\" (UID: \"50e7e6b4-78bd-4209-bf3e-7c27662763fd\") " Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.471288 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50e7e6b4-78bd-4209-bf3e-7c27662763fd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "50e7e6b4-78bd-4209-bf3e-7c27662763fd" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.472124 4553 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50e7e6b4-78bd-4209-bf3e-7c27662763fd-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.472143 4553 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50e7e6b4-78bd-4209-bf3e-7c27662763fd-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.478109 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "50e7e6b4-78bd-4209-bf3e-7c27662763fd" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.478828 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-kube-api-access-jfks5" (OuterVolumeSpecName: "kube-api-access-jfks5") pod "50e7e6b4-78bd-4209-bf3e-7c27662763fd" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd"). InnerVolumeSpecName "kube-api-access-jfks5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.480736 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "50e7e6b4-78bd-4209-bf3e-7c27662763fd" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.481089 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50e7e6b4-78bd-4209-bf3e-7c27662763fd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "50e7e6b4-78bd-4209-bf3e-7c27662763fd" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.488371 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50e7e6b4-78bd-4209-bf3e-7c27662763fd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "50e7e6b4-78bd-4209-bf3e-7c27662763fd" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.490092 4553 scope.go:117] "RemoveContainer" containerID="4ac1943685a1a234e02b0cf8b3786862c5328164394d3ae8ff52cb447a9dd616" Sep 30 19:40:01 crc kubenswrapper[4553]: E0930 19:40:01.490513 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ac1943685a1a234e02b0cf8b3786862c5328164394d3ae8ff52cb447a9dd616\": container with ID starting with 4ac1943685a1a234e02b0cf8b3786862c5328164394d3ae8ff52cb447a9dd616 not found: ID does not exist" containerID="4ac1943685a1a234e02b0cf8b3786862c5328164394d3ae8ff52cb447a9dd616" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.490548 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ac1943685a1a234e02b0cf8b3786862c5328164394d3ae8ff52cb447a9dd616"} err="failed to get container status \"4ac1943685a1a234e02b0cf8b3786862c5328164394d3ae8ff52cb447a9dd616\": rpc error: code = NotFound desc = could 
not find container \"4ac1943685a1a234e02b0cf8b3786862c5328164394d3ae8ff52cb447a9dd616\": container with ID starting with 4ac1943685a1a234e02b0cf8b3786862c5328164394d3ae8ff52cb447a9dd616 not found: ID does not exist" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.490890 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "50e7e6b4-78bd-4209-bf3e-7c27662763fd" (UID: "50e7e6b4-78bd-4209-bf3e-7c27662763fd"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.573544 4553 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.573575 4553 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50e7e6b4-78bd-4209-bf3e-7c27662763fd-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.573633 4553 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50e7e6b4-78bd-4209-bf3e-7c27662763fd-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.573644 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfks5\" (UniqueName: \"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-kube-api-access-jfks5\") on node \"crc\" DevicePath \"\"" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.573653 4553 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/50e7e6b4-78bd-4209-bf3e-7c27662763fd-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.801334 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6bpj6"] Sep 30 19:40:01 crc kubenswrapper[4553]: I0930 19:40:01.801421 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6bpj6"] Sep 30 19:40:03 crc kubenswrapper[4553]: I0930 19:40:03.520102 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e7e6b4-78bd-4209-bf3e-7c27662763fd" path="/var/lib/kubelet/pods/50e7e6b4-78bd-4209-bf3e-7c27662763fd/volumes" Sep 30 19:41:29 crc kubenswrapper[4553]: I0930 19:41:29.585597 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:41:29 crc kubenswrapper[4553]: I0930 19:41:29.586271 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:41:59 crc kubenswrapper[4553]: I0930 19:41:59.585211 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:41:59 crc kubenswrapper[4553]: I0930 19:41:59.586085 4553 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.523168 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-c9dqh"] Sep 30 19:42:00 crc kubenswrapper[4553]: E0930 19:42:00.523361 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e7e6b4-78bd-4209-bf3e-7c27662763fd" containerName="registry" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.523372 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e7e6b4-78bd-4209-bf3e-7c27662763fd" containerName="registry" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.523457 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e7e6b4-78bd-4209-bf3e-7c27662763fd" containerName="registry" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.523812 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-c9dqh" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.542061 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.542269 4553 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-tjfn4" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.542412 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.551666 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-c9dqh"] Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.606625 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wr26z"] Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.607503 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-wr26z" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.608727 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd74h\" (UniqueName: \"kubernetes.io/projected/f3dcc7e7-e268-44e4-bff9-83d283661835-kube-api-access-nd74h\") pod \"cert-manager-cainjector-7f985d654d-c9dqh\" (UID: \"f3dcc7e7-e268-44e4-bff9-83d283661835\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-c9dqh" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.612637 4553 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-dmzxv" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.622990 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wr26z"] Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.632365 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sht4s"] Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.633173 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-sht4s" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.636768 4553 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-q9c2n" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.649595 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sht4s"] Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.710026 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd74h\" (UniqueName: \"kubernetes.io/projected/f3dcc7e7-e268-44e4-bff9-83d283661835-kube-api-access-nd74h\") pod \"cert-manager-cainjector-7f985d654d-c9dqh\" (UID: \"f3dcc7e7-e268-44e4-bff9-83d283661835\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-c9dqh" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.710094 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlnxp\" (UniqueName: \"kubernetes.io/projected/9a08fdbf-1c77-41d2-9f95-97d8a44f709c-kube-api-access-hlnxp\") pod \"cert-manager-webhook-5655c58dd6-sht4s\" (UID: \"9a08fdbf-1c77-41d2-9f95-97d8a44f709c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sht4s" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.710126 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v72n5\" (UniqueName: \"kubernetes.io/projected/aa85437b-afbc-4b69-8e83-a4138eb4c992-kube-api-access-v72n5\") pod \"cert-manager-5b446d88c5-wr26z\" (UID: \"aa85437b-afbc-4b69-8e83-a4138eb4c992\") " pod="cert-manager/cert-manager-5b446d88c5-wr26z" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.729918 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd74h\" (UniqueName: 
\"kubernetes.io/projected/f3dcc7e7-e268-44e4-bff9-83d283661835-kube-api-access-nd74h\") pod \"cert-manager-cainjector-7f985d654d-c9dqh\" (UID: \"f3dcc7e7-e268-44e4-bff9-83d283661835\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-c9dqh" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.811410 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlnxp\" (UniqueName: \"kubernetes.io/projected/9a08fdbf-1c77-41d2-9f95-97d8a44f709c-kube-api-access-hlnxp\") pod \"cert-manager-webhook-5655c58dd6-sht4s\" (UID: \"9a08fdbf-1c77-41d2-9f95-97d8a44f709c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sht4s" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.811686 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v72n5\" (UniqueName: \"kubernetes.io/projected/aa85437b-afbc-4b69-8e83-a4138eb4c992-kube-api-access-v72n5\") pod \"cert-manager-5b446d88c5-wr26z\" (UID: \"aa85437b-afbc-4b69-8e83-a4138eb4c992\") " pod="cert-manager/cert-manager-5b446d88c5-wr26z" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.828124 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlnxp\" (UniqueName: \"kubernetes.io/projected/9a08fdbf-1c77-41d2-9f95-97d8a44f709c-kube-api-access-hlnxp\") pod \"cert-manager-webhook-5655c58dd6-sht4s\" (UID: \"9a08fdbf-1c77-41d2-9f95-97d8a44f709c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sht4s" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.831913 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v72n5\" (UniqueName: \"kubernetes.io/projected/aa85437b-afbc-4b69-8e83-a4138eb4c992-kube-api-access-v72n5\") pod \"cert-manager-5b446d88c5-wr26z\" (UID: \"aa85437b-afbc-4b69-8e83-a4138eb4c992\") " pod="cert-manager/cert-manager-5b446d88c5-wr26z" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.842817 4553 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-c9dqh" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.927683 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-wr26z" Sep 30 19:42:00 crc kubenswrapper[4553]: I0930 19:42:00.948229 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-sht4s" Sep 30 19:42:01 crc kubenswrapper[4553]: I0930 19:42:01.167067 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-wr26z"] Sep 30 19:42:01 crc kubenswrapper[4553]: I0930 19:42:01.179169 4553 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 19:42:01 crc kubenswrapper[4553]: I0930 19:42:01.227322 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sht4s"] Sep 30 19:42:01 crc kubenswrapper[4553]: I0930 19:42:01.270300 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-c9dqh"] Sep 30 19:42:01 crc kubenswrapper[4553]: W0930 19:42:01.276601 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3dcc7e7_e268_44e4_bff9_83d283661835.slice/crio-273ff259b99af3bcc642906a325e5a548d6110df0b104328dcc4e0499d86d31a WatchSource:0}: Error finding container 273ff259b99af3bcc642906a325e5a548d6110df0b104328dcc4e0499d86d31a: Status 404 returned error can't find the container with id 273ff259b99af3bcc642906a325e5a548d6110df0b104328dcc4e0499d86d31a Sep 30 19:42:01 crc kubenswrapper[4553]: I0930 19:42:01.328547 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-wr26z" 
event={"ID":"aa85437b-afbc-4b69-8e83-a4138eb4c992","Type":"ContainerStarted","Data":"d04adbdd001caf86fa633763a06c3cfed3f50be071ff6feace3b5d06af2a7bec"} Sep 30 19:42:01 crc kubenswrapper[4553]: I0930 19:42:01.329553 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-sht4s" event={"ID":"9a08fdbf-1c77-41d2-9f95-97d8a44f709c","Type":"ContainerStarted","Data":"3d586521016b3a993783d371905a90d0cdde38dcf78583d30859636ae94c95ef"} Sep 30 19:42:01 crc kubenswrapper[4553]: I0930 19:42:01.330492 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-c9dqh" event={"ID":"f3dcc7e7-e268-44e4-bff9-83d283661835","Type":"ContainerStarted","Data":"273ff259b99af3bcc642906a325e5a548d6110df0b104328dcc4e0499d86d31a"} Sep 30 19:42:05 crc kubenswrapper[4553]: I0930 19:42:05.352157 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-sht4s" event={"ID":"9a08fdbf-1c77-41d2-9f95-97d8a44f709c","Type":"ContainerStarted","Data":"425c42281574576d14a808f7c5a8a96b7c8a4b02608e4acf4cff27d1b53b0fcb"} Sep 30 19:42:05 crc kubenswrapper[4553]: I0930 19:42:05.352997 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-sht4s" Sep 30 19:42:05 crc kubenswrapper[4553]: I0930 19:42:05.354374 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-c9dqh" event={"ID":"f3dcc7e7-e268-44e4-bff9-83d283661835","Type":"ContainerStarted","Data":"b70695fe239be9503385f1d07107514d5b09ba53a8319bb95937fd7b1bae86ab"} Sep 30 19:42:05 crc kubenswrapper[4553]: I0930 19:42:05.356471 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-wr26z" event={"ID":"aa85437b-afbc-4b69-8e83-a4138eb4c992","Type":"ContainerStarted","Data":"1c3f4b4c23fbce1d84e76b7a67fe5034c0b4ac17308d80caaa64cb145c495508"} Sep 30 19:42:05 crc 
kubenswrapper[4553]: I0930 19:42:05.389421 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-c9dqh" podStartSLOduration=2.105508549 podStartE2EDuration="5.389378277s" podCreationTimestamp="2025-09-30 19:42:00 +0000 UTC" firstStartedPulling="2025-09-30 19:42:01.280738605 +0000 UTC m=+574.480240735" lastFinishedPulling="2025-09-30 19:42:04.564608333 +0000 UTC m=+577.764110463" observedRunningTime="2025-09-30 19:42:05.387898147 +0000 UTC m=+578.587400287" watchObservedRunningTime="2025-09-30 19:42:05.389378277 +0000 UTC m=+578.588880417" Sep 30 19:42:05 crc kubenswrapper[4553]: I0930 19:42:05.390579 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-sht4s" podStartSLOduration=2.160981157 podStartE2EDuration="5.390565779s" podCreationTimestamp="2025-09-30 19:42:00 +0000 UTC" firstStartedPulling="2025-09-30 19:42:01.242809327 +0000 UTC m=+574.442311457" lastFinishedPulling="2025-09-30 19:42:04.472393939 +0000 UTC m=+577.671896079" observedRunningTime="2025-09-30 19:42:05.373701776 +0000 UTC m=+578.573203926" watchObservedRunningTime="2025-09-30 19:42:05.390565779 +0000 UTC m=+578.590067919" Sep 30 19:42:10 crc kubenswrapper[4553]: I0930 19:42:10.954440 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-sht4s" Sep 30 19:42:10 crc kubenswrapper[4553]: I0930 19:42:10.980426 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-wr26z" podStartSLOduration=7.809587177 podStartE2EDuration="10.980398803s" podCreationTimestamp="2025-09-30 19:42:00 +0000 UTC" firstStartedPulling="2025-09-30 19:42:01.17881769 +0000 UTC m=+574.378319820" lastFinishedPulling="2025-09-30 19:42:04.349629316 +0000 UTC m=+577.549131446" observedRunningTime="2025-09-30 19:42:05.405935341 +0000 UTC m=+578.605437491" 
watchObservedRunningTime="2025-09-30 19:42:10.980398803 +0000 UTC m=+584.179900973" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.144567 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fmsrf"] Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.145253 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovn-controller" containerID="cri-o://e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c" gracePeriod=30 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.145306 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="nbdb" containerID="cri-o://57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f" gracePeriod=30 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.145443 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="kube-rbac-proxy-node" containerID="cri-o://00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02" gracePeriod=30 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.145414 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="northd" containerID="cri-o://743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e" gracePeriod=30 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.145572 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4" gracePeriod=30 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.145386 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="sbdb" containerID="cri-o://82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416" gracePeriod=30 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.145429 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovn-acl-logging" containerID="cri-o://b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9" gracePeriod=30 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.193480 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" containerID="cri-o://1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148" gracePeriod=30 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.403948 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovnkube-controller/3.log" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.406207 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovn-acl-logging/0.log" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.406752 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovn-controller/0.log" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.407396 4553 generic.go:334] "Generic (PLEG): 
container finished" podID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerID="1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148" exitCode=0 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.407442 4553 generic.go:334] "Generic (PLEG): container finished" podID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerID="528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4" exitCode=0 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.407453 4553 generic.go:334] "Generic (PLEG): container finished" podID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerID="00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02" exitCode=0 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.407453 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148"} Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.407507 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4"} Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.407519 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02"} Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.407529 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9"} Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.407549 
4553 scope.go:117] "RemoveContainer" containerID="58d25887b59580d59aee541ff4dc770cd6ede2f5f62ac3d0e5b28abdd16bf92c" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.407463 4553 generic.go:334] "Generic (PLEG): container finished" podID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerID="b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9" exitCode=143 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.407586 4553 generic.go:334] "Generic (PLEG): container finished" podID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerID="e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c" exitCode=143 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.407654 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c"} Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.411444 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vzlwd_0d6b9396-3666-49a3-9d06-f764a3b39edf/kube-multus/2.log" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.411895 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vzlwd_0d6b9396-3666-49a3-9d06-f764a3b39edf/kube-multus/1.log" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.411934 4553 generic.go:334] "Generic (PLEG): container finished" podID="0d6b9396-3666-49a3-9d06-f764a3b39edf" containerID="81d6a88a8c1b8af5edd73d213278c902fce9950b02c0160d289373bf4061862a" exitCode=2 Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.411966 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vzlwd" event={"ID":"0d6b9396-3666-49a3-9d06-f764a3b39edf","Type":"ContainerDied","Data":"81d6a88a8c1b8af5edd73d213278c902fce9950b02c0160d289373bf4061862a"} Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.412562 
4553 scope.go:117] "RemoveContainer" containerID="81d6a88a8c1b8af5edd73d213278c902fce9950b02c0160d289373bf4061862a" Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.412799 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-vzlwd_openshift-multus(0d6b9396-3666-49a3-9d06-f764a3b39edf)\"" pod="openshift-multus/multus-vzlwd" podUID="0d6b9396-3666-49a3-9d06-f764a3b39edf" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.439114 4553 scope.go:117] "RemoveContainer" containerID="b6dc41b9c827c96a6cf2567e40b6c09a48358331418ee9f753187a7381186e93" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.527541 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovn-acl-logging/0.log" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.528122 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovn-controller/0.log" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.528553 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.573494 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-var-lib-openvswitch\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.573836 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-env-overrides\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.573911 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-openvswitch\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.573656 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574017 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574036 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-systemd\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574162 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-node-log\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574190 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-etc-openvswitch\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574225 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-cni-bin\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574265 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-kubelet\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574297 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574308 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-run-netns\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574333 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574356 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz6qz\" (UniqueName: \"kubernetes.io/projected/4457466e-c6fd-4a2f-8b73-c205c50f90e3-kube-api-access-dz6qz\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574362 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574390 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-run-ovn-kubernetes\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574435 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovn-node-metrics-cert\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574463 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574498 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-ovn\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574524 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-systemd-units\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574566 4553 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-log-socket\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574622 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovnkube-script-lib\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574673 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-cni-netd\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574724 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovnkube-config\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574756 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-slash\") pod \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\" (UID: \"4457466e-c6fd-4a2f-8b73-c205c50f90e3\") " Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.575179 4553 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 
19:42:11.575202 4553 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.575215 4553 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.575231 4553 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.575243 4553 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-run-netns\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574391 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.574412 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.575207 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-node-log" (OuterVolumeSpecName: "node-log") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.575272 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-log-socket" (OuterVolumeSpecName: "log-socket") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.575328 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.575451 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.575526 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.575559 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.575624 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-slash" (OuterVolumeSpecName: "host-slash") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.575659 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.576109 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.576522 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.588282 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.588386 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4457466e-c6fd-4a2f-8b73-c205c50f90e3-kube-api-access-dz6qz" (OuterVolumeSpecName: "kube-api-access-dz6qz") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "kube-api-access-dz6qz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.593629 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4457466e-c6fd-4a2f-8b73-c205c50f90e3" (UID: "4457466e-c6fd-4a2f-8b73-c205c50f90e3"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.594367 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kfw7j"] Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.594793 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.594861 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.594915 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.594964 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.595032 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="northd" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.595103 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="northd" Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.595158 4553 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="kubecfg-setup" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.595286 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="kubecfg-setup" Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.595340 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovn-acl-logging" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.595392 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovn-acl-logging" Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.595440 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="kube-rbac-proxy-node" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.595486 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="kube-rbac-proxy-node" Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.595535 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.595586 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.595636 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="nbdb" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.595721 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="nbdb" Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.595762 4553 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovn-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.595779 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovn-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.595796 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="sbdb" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.595810 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="sbdb" Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.595822 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.595831 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596172 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="northd" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596199 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="nbdb" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596213 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596228 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596238 4553 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="sbdb" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596248 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596260 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovn-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596276 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="kube-rbac-proxy-node" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596290 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596302 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596344 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596357 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovn-acl-logging" Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.596491 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596503 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: E0930 19:42:11.596769 4553 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.596788 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerName="ovnkube-controller" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.599104 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677008 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-run-openvswitch\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677138 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-ovnkube-config\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677174 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-ovn-node-metrics-cert\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677244 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677283 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-etc-openvswitch\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677304 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-systemd-units\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677330 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-run-netns\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677353 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-kubelet\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677378 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-run-systemd\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677395 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-var-lib-openvswitch\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677415 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-cni-bin\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677452 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-run-ovn-kubernetes\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677518 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtfrx\" (UniqueName: \"kubernetes.io/projected/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-kube-api-access-rtfrx\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677559 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-node-log\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677630 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-log-socket\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677683 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-ovnkube-script-lib\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677723 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-slash\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677756 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-env-overrides\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677785 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-cni-netd\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677802 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-run-ovn\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677883 4553 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677894 4553 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677904 4553 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677914 4553 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677923 4553 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677934 4553 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-log-socket\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677948 4553 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677962 4553 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677972 4553 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4457466e-c6fd-4a2f-8b73-c205c50f90e3-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677983 4553 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-slash\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.677992 4553 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.678000 4553 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-node-log\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc 
kubenswrapper[4553]: I0930 19:42:11.678009 4553 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.678020 4553 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4457466e-c6fd-4a2f-8b73-c205c50f90e3-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.678030 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz6qz\" (UniqueName: \"kubernetes.io/projected/4457466e-c6fd-4a2f-8b73-c205c50f90e3-kube-api-access-dz6qz\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780087 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-slash\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780151 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-env-overrides\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780199 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-slash\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780243 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-cni-netd\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780205 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-cni-netd\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780296 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-run-ovn\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780329 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-run-openvswitch\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780347 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-ovnkube-config\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780394 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-ovn-node-metrics-cert\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780451 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780477 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-etc-openvswitch\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780508 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-systemd-units\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780500 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-run-ovn\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780564 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-run-netns\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780545 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-run-netns\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780672 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-kubelet\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780710 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-run-systemd\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780753 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-var-lib-openvswitch\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780789 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-cni-bin\") 
pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780855 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-env-overrides\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780451 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-run-openvswitch\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780892 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-run-ovn-kubernetes\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780930 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780942 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtfrx\" (UniqueName: \"kubernetes.io/projected/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-kube-api-access-rtfrx\") pod 
\"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780965 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-etc-openvswitch\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.780981 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-node-log\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.781006 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-systemd-units\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.781029 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-log-socket\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.781070 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-cni-bin\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 
30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.781112 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-kubelet\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.781114 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-ovnkube-script-lib\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.781148 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-run-systemd\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.781269 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-log-socket\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.781308 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-var-lib-openvswitch\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.781525 4553 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-ovnkube-config\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.781607 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-host-run-ovn-kubernetes\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.781654 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-node-log\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.781792 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-ovnkube-script-lib\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.783774 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-ovn-node-metrics-cert\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.804411 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtfrx\" (UniqueName: 
\"kubernetes.io/projected/401941c2-b5a8-47e0-8dfc-18fbb59eeac3-kube-api-access-rtfrx\") pod \"ovnkube-node-kfw7j\" (UID: \"401941c2-b5a8-47e0-8dfc-18fbb59eeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: I0930 19:42:11.915259 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:11 crc kubenswrapper[4553]: W0930 19:42:11.947147 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod401941c2_b5a8_47e0_8dfc_18fbb59eeac3.slice/crio-dc1e50e5fe1928fcb4365c14c0b72585ce8f229e57e286ca9f1c04dde61393c5 WatchSource:0}: Error finding container dc1e50e5fe1928fcb4365c14c0b72585ce8f229e57e286ca9f1c04dde61393c5: Status 404 returned error can't find the container with id dc1e50e5fe1928fcb4365c14c0b72585ce8f229e57e286ca9f1c04dde61393c5 Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.424653 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovn-acl-logging/0.log" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.427197 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fmsrf_4457466e-c6fd-4a2f-8b73-c205c50f90e3/ovn-controller/0.log" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.427669 4553 generic.go:334] "Generic (PLEG): container finished" podID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerID="82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416" exitCode=0 Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.427710 4553 generic.go:334] "Generic (PLEG): container finished" podID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerID="57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f" exitCode=0 Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.427722 4553 generic.go:334] 
"Generic (PLEG): container finished" podID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" containerID="743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e" exitCode=0 Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.427799 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416"} Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.427835 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f"} Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.427852 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e"} Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.427866 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" event={"ID":"4457466e-c6fd-4a2f-8b73-c205c50f90e3","Type":"ContainerDied","Data":"12d3a586bfcc7f16a8463399ff40c0805db03877450618a1169429a3a8f70985"} Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.427892 4553 scope.go:117] "RemoveContainer" containerID="1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.428135 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fmsrf" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.431501 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vzlwd_0d6b9396-3666-49a3-9d06-f764a3b39edf/kube-multus/2.log" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.433797 4553 generic.go:334] "Generic (PLEG): container finished" podID="401941c2-b5a8-47e0-8dfc-18fbb59eeac3" containerID="aa0aad2844d443865452c443c4971a99bd200d7650c48f2b12b390aebc2b26bb" exitCode=0 Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.433851 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" event={"ID":"401941c2-b5a8-47e0-8dfc-18fbb59eeac3","Type":"ContainerDied","Data":"aa0aad2844d443865452c443c4971a99bd200d7650c48f2b12b390aebc2b26bb"} Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.433905 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" event={"ID":"401941c2-b5a8-47e0-8dfc-18fbb59eeac3","Type":"ContainerStarted","Data":"dc1e50e5fe1928fcb4365c14c0b72585ce8f229e57e286ca9f1c04dde61393c5"} Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.461450 4553 scope.go:117] "RemoveContainer" containerID="82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.522484 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fmsrf"] Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.530579 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fmsrf"] Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.540308 4553 scope.go:117] "RemoveContainer" containerID="57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.572605 4553 scope.go:117] "RemoveContainer" 
containerID="743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.592925 4553 scope.go:117] "RemoveContainer" containerID="528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.610916 4553 scope.go:117] "RemoveContainer" containerID="00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.633245 4553 scope.go:117] "RemoveContainer" containerID="b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.650622 4553 scope.go:117] "RemoveContainer" containerID="e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.674428 4553 scope.go:117] "RemoveContainer" containerID="e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.742930 4553 scope.go:117] "RemoveContainer" containerID="1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148" Sep 30 19:42:12 crc kubenswrapper[4553]: E0930 19:42:12.746324 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148\": container with ID starting with 1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148 not found: ID does not exist" containerID="1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.746383 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148"} err="failed to get container status \"1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148\": rpc error: code = NotFound desc = could not 
find container \"1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148\": container with ID starting with 1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.746432 4553 scope.go:117] "RemoveContainer" containerID="82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416" Sep 30 19:42:12 crc kubenswrapper[4553]: E0930 19:42:12.757209 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\": container with ID starting with 82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416 not found: ID does not exist" containerID="82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.757253 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416"} err="failed to get container status \"82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\": rpc error: code = NotFound desc = could not find container \"82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\": container with ID starting with 82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.757279 4553 scope.go:117] "RemoveContainer" containerID="57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f" Sep 30 19:42:12 crc kubenswrapper[4553]: E0930 19:42:12.762363 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\": container with ID starting with 57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f not found: ID 
does not exist" containerID="57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.762408 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f"} err="failed to get container status \"57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\": rpc error: code = NotFound desc = could not find container \"57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\": container with ID starting with 57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.762430 4553 scope.go:117] "RemoveContainer" containerID="743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e" Sep 30 19:42:12 crc kubenswrapper[4553]: E0930 19:42:12.763431 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\": container with ID starting with 743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e not found: ID does not exist" containerID="743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.763460 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e"} err="failed to get container status \"743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\": rpc error: code = NotFound desc = could not find container \"743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\": container with ID starting with 743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.763474 4553 
scope.go:117] "RemoveContainer" containerID="528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4" Sep 30 19:42:12 crc kubenswrapper[4553]: E0930 19:42:12.763850 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\": container with ID starting with 528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4 not found: ID does not exist" containerID="528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.763874 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4"} err="failed to get container status \"528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\": rpc error: code = NotFound desc = could not find container \"528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\": container with ID starting with 528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.763891 4553 scope.go:117] "RemoveContainer" containerID="00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02" Sep 30 19:42:12 crc kubenswrapper[4553]: E0930 19:42:12.764234 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\": container with ID starting with 00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02 not found: ID does not exist" containerID="00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.764258 4553 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02"} err="failed to get container status \"00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\": rpc error: code = NotFound desc = could not find container \"00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\": container with ID starting with 00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.764270 4553 scope.go:117] "RemoveContainer" containerID="b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9" Sep 30 19:42:12 crc kubenswrapper[4553]: E0930 19:42:12.765498 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\": container with ID starting with b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9 not found: ID does not exist" containerID="b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.765541 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9"} err="failed to get container status \"b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\": rpc error: code = NotFound desc = could not find container \"b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\": container with ID starting with b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.765573 4553 scope.go:117] "RemoveContainer" containerID="e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c" Sep 30 19:42:12 crc kubenswrapper[4553]: E0930 19:42:12.766408 4553 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\": container with ID starting with e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c not found: ID does not exist" containerID="e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.766455 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c"} err="failed to get container status \"e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\": rpc error: code = NotFound desc = could not find container \"e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\": container with ID starting with e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.766470 4553 scope.go:117] "RemoveContainer" containerID="e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54" Sep 30 19:42:12 crc kubenswrapper[4553]: E0930 19:42:12.767753 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\": container with ID starting with e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54 not found: ID does not exist" containerID="e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.767798 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54"} err="failed to get container status \"e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\": rpc error: code = NotFound desc = could not find container 
\"e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\": container with ID starting with e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.767818 4553 scope.go:117] "RemoveContainer" containerID="1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.768271 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148"} err="failed to get container status \"1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148\": rpc error: code = NotFound desc = could not find container \"1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148\": container with ID starting with 1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.768311 4553 scope.go:117] "RemoveContainer" containerID="82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.768970 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416"} err="failed to get container status \"82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\": rpc error: code = NotFound desc = could not find container \"82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\": container with ID starting with 82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.769006 4553 scope.go:117] "RemoveContainer" containerID="57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.769329 4553 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f"} err="failed to get container status \"57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\": rpc error: code = NotFound desc = could not find container \"57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\": container with ID starting with 57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.769367 4553 scope.go:117] "RemoveContainer" containerID="743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.769568 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e"} err="failed to get container status \"743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\": rpc error: code = NotFound desc = could not find container \"743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\": container with ID starting with 743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.769594 4553 scope.go:117] "RemoveContainer" containerID="528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.770122 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4"} err="failed to get container status \"528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\": rpc error: code = NotFound desc = could not find container \"528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\": container with ID starting with 
528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.770142 4553 scope.go:117] "RemoveContainer" containerID="00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.770339 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02"} err="failed to get container status \"00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\": rpc error: code = NotFound desc = could not find container \"00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\": container with ID starting with 00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.770360 4553 scope.go:117] "RemoveContainer" containerID="b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.770603 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9"} err="failed to get container status \"b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\": rpc error: code = NotFound desc = could not find container \"b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\": container with ID starting with b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.770620 4553 scope.go:117] "RemoveContainer" containerID="e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.770943 4553 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c"} err="failed to get container status \"e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\": rpc error: code = NotFound desc = could not find container \"e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\": container with ID starting with e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.770962 4553 scope.go:117] "RemoveContainer" containerID="e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.772174 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54"} err="failed to get container status \"e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\": rpc error: code = NotFound desc = could not find container \"e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\": container with ID starting with e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.772199 4553 scope.go:117] "RemoveContainer" containerID="1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.772582 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148"} err="failed to get container status \"1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148\": rpc error: code = NotFound desc = could not find container \"1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148\": container with ID starting with 1be600e1a271b3b041baf996bcfd71e3cf433fe00b1cf04dc255fc5ece677148 not found: ID does not 
exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.772642 4553 scope.go:117] "RemoveContainer" containerID="82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.773062 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416"} err="failed to get container status \"82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\": rpc error: code = NotFound desc = could not find container \"82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416\": container with ID starting with 82742409cefe616dcf44997fc2b02c0ab321c8022c4dcbc9a89049f4e4bea416 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.773088 4553 scope.go:117] "RemoveContainer" containerID="57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.773464 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f"} err="failed to get container status \"57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\": rpc error: code = NotFound desc = could not find container \"57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f\": container with ID starting with 57f8dbf09d4c7c961396049ec3da4d9cd3e0d3aa25bf228e95abb9ab97c6e22f not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.773492 4553 scope.go:117] "RemoveContainer" containerID="743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.773718 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e"} err="failed to get container status 
\"743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\": rpc error: code = NotFound desc = could not find container \"743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e\": container with ID starting with 743b5a32a034a3dd75dc174cc7547bfaa2a0ea97eed1d5b2cbbb07e5ee82f58e not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.773743 4553 scope.go:117] "RemoveContainer" containerID="528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.773939 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4"} err="failed to get container status \"528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\": rpc error: code = NotFound desc = could not find container \"528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4\": container with ID starting with 528dcd43f3241dcca844d2136d3e0323985c7f6fa764a5bd1cc18fe8f42127b4 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.773974 4553 scope.go:117] "RemoveContainer" containerID="00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.774324 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02"} err="failed to get container status \"00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\": rpc error: code = NotFound desc = could not find container \"00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02\": container with ID starting with 00bcf505ad2143b7b1d4eb34d1c3129ac8cc6d9da09426b6ad018a56a4f43e02 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.774352 4553 scope.go:117] "RemoveContainer" 
containerID="b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.774712 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9"} err="failed to get container status \"b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\": rpc error: code = NotFound desc = could not find container \"b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9\": container with ID starting with b94dda592288ebfc11679f5e5b2968be1b494eb0aacaecc97b7e0dd56180e5e9 not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.774734 4553 scope.go:117] "RemoveContainer" containerID="e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.775126 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c"} err="failed to get container status \"e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\": rpc error: code = NotFound desc = could not find container \"e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c\": container with ID starting with e8ed9b7e46db89bcdaf8b97166bd2e9205a617241fef6846c79730617d4ec66c not found: ID does not exist" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.775148 4553 scope.go:117] "RemoveContainer" containerID="e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54" Sep 30 19:42:12 crc kubenswrapper[4553]: I0930 19:42:12.775851 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54"} err="failed to get container status \"e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\": rpc error: code = NotFound desc = could 
not find container \"e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54\": container with ID starting with e3db381b6a40962eeee2d220e4f4be920cb1d5e75b735e4eae3ee4d191b2bc54 not found: ID does not exist" Sep 30 19:42:13 crc kubenswrapper[4553]: I0930 19:42:13.447368 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" event={"ID":"401941c2-b5a8-47e0-8dfc-18fbb59eeac3","Type":"ContainerStarted","Data":"f2036147cdf5967b99c26adda72687f992b2bd9f8551b6b938c5292034cbd753"} Sep 30 19:42:13 crc kubenswrapper[4553]: I0930 19:42:13.447422 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" event={"ID":"401941c2-b5a8-47e0-8dfc-18fbb59eeac3","Type":"ContainerStarted","Data":"bc2ec47b3f3f0650e594cd7552943a80d7470f927656c62fee607a418edb487c"} Sep 30 19:42:13 crc kubenswrapper[4553]: I0930 19:42:13.447443 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" event={"ID":"401941c2-b5a8-47e0-8dfc-18fbb59eeac3","Type":"ContainerStarted","Data":"37a757f7e389e6a654efbb5a6432ac07aa29eaffb5e4efc38eb7df0798e53388"} Sep 30 19:42:13 crc kubenswrapper[4553]: I0930 19:42:13.447461 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" event={"ID":"401941c2-b5a8-47e0-8dfc-18fbb59eeac3","Type":"ContainerStarted","Data":"61665bc5158fde0b270ec348cbc83e84ed0c6dfe83b5fe89c9261cce891388ee"} Sep 30 19:42:13 crc kubenswrapper[4553]: I0930 19:42:13.447477 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" event={"ID":"401941c2-b5a8-47e0-8dfc-18fbb59eeac3","Type":"ContainerStarted","Data":"22dcd4d9065b7400161ffc945e0cb0dc56738fe01a35884d599574a70f67d0aa"} Sep 30 19:42:13 crc kubenswrapper[4553]: I0930 19:42:13.447493 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" 
event={"ID":"401941c2-b5a8-47e0-8dfc-18fbb59eeac3","Type":"ContainerStarted","Data":"4f0b62fe80e101ebebe21296c873d311ee6c92064ece3946bd2b8f208ade5b7c"} Sep 30 19:42:13 crc kubenswrapper[4553]: I0930 19:42:13.515479 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4457466e-c6fd-4a2f-8b73-c205c50f90e3" path="/var/lib/kubelet/pods/4457466e-c6fd-4a2f-8b73-c205c50f90e3/volumes" Sep 30 19:42:16 crc kubenswrapper[4553]: I0930 19:42:16.476575 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" event={"ID":"401941c2-b5a8-47e0-8dfc-18fbb59eeac3","Type":"ContainerStarted","Data":"cc759774d8e30fabafd3cb55e55de1e335d21551d52e3be5a81f0f465209a61b"} Sep 30 19:42:18 crc kubenswrapper[4553]: I0930 19:42:18.491645 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" event={"ID":"401941c2-b5a8-47e0-8dfc-18fbb59eeac3","Type":"ContainerStarted","Data":"fc2a5555f838c9b4e47448e35a6336fe734503f1139df43567cf1cdba48acc2a"} Sep 30 19:42:18 crc kubenswrapper[4553]: I0930 19:42:18.492202 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:18 crc kubenswrapper[4553]: I0930 19:42:18.492235 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:18 crc kubenswrapper[4553]: I0930 19:42:18.524525 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:18 crc kubenswrapper[4553]: I0930 19:42:18.528369 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" podStartSLOduration=7.528355759 podStartE2EDuration="7.528355759s" podCreationTimestamp="2025-09-30 19:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-09-30 19:42:18.525210206 +0000 UTC m=+591.724712336" watchObservedRunningTime="2025-09-30 19:42:18.528355759 +0000 UTC m=+591.727857889" Sep 30 19:42:19 crc kubenswrapper[4553]: I0930 19:42:19.499173 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:19 crc kubenswrapper[4553]: I0930 19:42:19.553837 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:24 crc kubenswrapper[4553]: I0930 19:42:24.504633 4553 scope.go:117] "RemoveContainer" containerID="81d6a88a8c1b8af5edd73d213278c902fce9950b02c0160d289373bf4061862a" Sep 30 19:42:24 crc kubenswrapper[4553]: E0930 19:42:24.506214 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-vzlwd_openshift-multus(0d6b9396-3666-49a3-9d06-f764a3b39edf)\"" pod="openshift-multus/multus-vzlwd" podUID="0d6b9396-3666-49a3-9d06-f764a3b39edf" Sep 30 19:42:29 crc kubenswrapper[4553]: I0930 19:42:29.584888 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:42:29 crc kubenswrapper[4553]: I0930 19:42:29.585363 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:42:29 crc kubenswrapper[4553]: I0930 19:42:29.585429 4553 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:42:29 crc kubenswrapper[4553]: I0930 19:42:29.586429 4553 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"51154b57f12370c60080e989e52d35722515976fa625d655a2c4cbbb683003ca"} pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:42:29 crc kubenswrapper[4553]: I0930 19:42:29.586536 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" containerID="cri-o://51154b57f12370c60080e989e52d35722515976fa625d655a2c4cbbb683003ca" gracePeriod=600 Sep 30 19:42:30 crc kubenswrapper[4553]: I0930 19:42:30.586189 4553 generic.go:334] "Generic (PLEG): container finished" podID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerID="51154b57f12370c60080e989e52d35722515976fa625d655a2c4cbbb683003ca" exitCode=0 Sep 30 19:42:30 crc kubenswrapper[4553]: I0930 19:42:30.586286 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerDied","Data":"51154b57f12370c60080e989e52d35722515976fa625d655a2c4cbbb683003ca"} Sep 30 19:42:30 crc kubenswrapper[4553]: I0930 19:42:30.586569 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerStarted","Data":"4d0705cac6e5b952d02766c3f1729599066280437bbe55ec8f4688736bf24a4f"} Sep 30 19:42:30 crc kubenswrapper[4553]: I0930 19:42:30.586591 4553 scope.go:117] "RemoveContainer" 
containerID="7c114586c54354df4e3892b93d193976a14755ff2513086bcc2ebc83fbe5f06f" Sep 30 19:42:37 crc kubenswrapper[4553]: I0930 19:42:37.511130 4553 scope.go:117] "RemoveContainer" containerID="81d6a88a8c1b8af5edd73d213278c902fce9950b02c0160d289373bf4061862a" Sep 30 19:42:38 crc kubenswrapper[4553]: I0930 19:42:38.659930 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vzlwd_0d6b9396-3666-49a3-9d06-f764a3b39edf/kube-multus/2.log" Sep 30 19:42:38 crc kubenswrapper[4553]: I0930 19:42:38.660695 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vzlwd" event={"ID":"0d6b9396-3666-49a3-9d06-f764a3b39edf","Type":"ContainerStarted","Data":"da7c64deb95ef40bd1f244e05d42c5037567e30e535b9d2ed6b88dd36bac508f"} Sep 30 19:42:41 crc kubenswrapper[4553]: I0930 19:42:41.940438 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kfw7j" Sep 30 19:42:52 crc kubenswrapper[4553]: I0930 19:42:52.238462 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2"] Sep 30 19:42:52 crc kubenswrapper[4553]: I0930 19:42:52.240465 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" Sep 30 19:42:52 crc kubenswrapper[4553]: I0930 19:42:52.242647 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 19:42:52 crc kubenswrapper[4553]: I0930 19:42:52.254647 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2"] Sep 30 19:42:52 crc kubenswrapper[4553]: I0930 19:42:52.395732 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04da03ab-c17c-40e3-ab34-524cda37de29-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2\" (UID: \"04da03ab-c17c-40e3-ab34-524cda37de29\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" Sep 30 19:42:52 crc kubenswrapper[4553]: I0930 19:42:52.396259 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04da03ab-c17c-40e3-ab34-524cda37de29-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2\" (UID: \"04da03ab-c17c-40e3-ab34-524cda37de29\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" Sep 30 19:42:52 crc kubenswrapper[4553]: I0930 19:42:52.396582 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnsbl\" (UniqueName: \"kubernetes.io/projected/04da03ab-c17c-40e3-ab34-524cda37de29-kube-api-access-gnsbl\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2\" (UID: \"04da03ab-c17c-40e3-ab34-524cda37de29\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" Sep 30 19:42:52 crc kubenswrapper[4553]: 
I0930 19:42:52.497662 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04da03ab-c17c-40e3-ab34-524cda37de29-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2\" (UID: \"04da03ab-c17c-40e3-ab34-524cda37de29\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" Sep 30 19:42:52 crc kubenswrapper[4553]: I0930 19:42:52.497740 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnsbl\" (UniqueName: \"kubernetes.io/projected/04da03ab-c17c-40e3-ab34-524cda37de29-kube-api-access-gnsbl\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2\" (UID: \"04da03ab-c17c-40e3-ab34-524cda37de29\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" Sep 30 19:42:52 crc kubenswrapper[4553]: I0930 19:42:52.497809 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04da03ab-c17c-40e3-ab34-524cda37de29-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2\" (UID: \"04da03ab-c17c-40e3-ab34-524cda37de29\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" Sep 30 19:42:52 crc kubenswrapper[4553]: I0930 19:42:52.498632 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04da03ab-c17c-40e3-ab34-524cda37de29-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2\" (UID: \"04da03ab-c17c-40e3-ab34-524cda37de29\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" Sep 30 19:42:52 crc kubenswrapper[4553]: I0930 19:42:52.498709 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/04da03ab-c17c-40e3-ab34-524cda37de29-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2\" (UID: \"04da03ab-c17c-40e3-ab34-524cda37de29\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" Sep 30 19:42:52 crc kubenswrapper[4553]: I0930 19:42:52.532805 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnsbl\" (UniqueName: \"kubernetes.io/projected/04da03ab-c17c-40e3-ab34-524cda37de29-kube-api-access-gnsbl\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2\" (UID: \"04da03ab-c17c-40e3-ab34-524cda37de29\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" Sep 30 19:42:52 crc kubenswrapper[4553]: I0930 19:42:52.557349 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" Sep 30 19:42:53 crc kubenswrapper[4553]: I0930 19:42:53.049308 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2"] Sep 30 19:42:53 crc kubenswrapper[4553]: I0930 19:42:53.759956 4553 generic.go:334] "Generic (PLEG): container finished" podID="04da03ab-c17c-40e3-ab34-524cda37de29" containerID="53037977e8998ac92d3b8b5704989dae708728e106465fa67164c9757b43e2d8" exitCode=0 Sep 30 19:42:53 crc kubenswrapper[4553]: I0930 19:42:53.760134 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" event={"ID":"04da03ab-c17c-40e3-ab34-524cda37de29","Type":"ContainerDied","Data":"53037977e8998ac92d3b8b5704989dae708728e106465fa67164c9757b43e2d8"} Sep 30 19:42:53 crc kubenswrapper[4553]: I0930 19:42:53.761516 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" event={"ID":"04da03ab-c17c-40e3-ab34-524cda37de29","Type":"ContainerStarted","Data":"d60a231d4b4113b7f19639fc736675d470fa4d40f9872bc919fd8d3b5354043d"} Sep 30 19:42:55 crc kubenswrapper[4553]: I0930 19:42:55.779743 4553 generic.go:334] "Generic (PLEG): container finished" podID="04da03ab-c17c-40e3-ab34-524cda37de29" containerID="6a368b129f35115eb0dfe5b3a91dee55cd40a3e97b3290022222a91c57d918d2" exitCode=0 Sep 30 19:42:55 crc kubenswrapper[4553]: I0930 19:42:55.779844 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" event={"ID":"04da03ab-c17c-40e3-ab34-524cda37de29","Type":"ContainerDied","Data":"6a368b129f35115eb0dfe5b3a91dee55cd40a3e97b3290022222a91c57d918d2"} Sep 30 19:42:56 crc kubenswrapper[4553]: I0930 19:42:56.792110 4553 generic.go:334] "Generic (PLEG): container finished" podID="04da03ab-c17c-40e3-ab34-524cda37de29" containerID="1c8c4f3ba4f9b04a183efce4087f34cd82e39457d03a3817e064cf3f7586dbc9" exitCode=0 Sep 30 19:42:56 crc kubenswrapper[4553]: I0930 19:42:56.792588 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" event={"ID":"04da03ab-c17c-40e3-ab34-524cda37de29","Type":"ContainerDied","Data":"1c8c4f3ba4f9b04a183efce4087f34cd82e39457d03a3817e064cf3f7586dbc9"} Sep 30 19:42:58 crc kubenswrapper[4553]: I0930 19:42:58.205331 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" Sep 30 19:42:58 crc kubenswrapper[4553]: I0930 19:42:58.289844 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04da03ab-c17c-40e3-ab34-524cda37de29-bundle\") pod \"04da03ab-c17c-40e3-ab34-524cda37de29\" (UID: \"04da03ab-c17c-40e3-ab34-524cda37de29\") " Sep 30 19:42:58 crc kubenswrapper[4553]: I0930 19:42:58.289897 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnsbl\" (UniqueName: \"kubernetes.io/projected/04da03ab-c17c-40e3-ab34-524cda37de29-kube-api-access-gnsbl\") pod \"04da03ab-c17c-40e3-ab34-524cda37de29\" (UID: \"04da03ab-c17c-40e3-ab34-524cda37de29\") " Sep 30 19:42:58 crc kubenswrapper[4553]: I0930 19:42:58.289920 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04da03ab-c17c-40e3-ab34-524cda37de29-util\") pod \"04da03ab-c17c-40e3-ab34-524cda37de29\" (UID: \"04da03ab-c17c-40e3-ab34-524cda37de29\") " Sep 30 19:42:58 crc kubenswrapper[4553]: I0930 19:42:58.291709 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04da03ab-c17c-40e3-ab34-524cda37de29-bundle" (OuterVolumeSpecName: "bundle") pod "04da03ab-c17c-40e3-ab34-524cda37de29" (UID: "04da03ab-c17c-40e3-ab34-524cda37de29"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:42:58 crc kubenswrapper[4553]: I0930 19:42:58.292343 4553 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04da03ab-c17c-40e3-ab34-524cda37de29-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:58 crc kubenswrapper[4553]: I0930 19:42:58.295272 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04da03ab-c17c-40e3-ab34-524cda37de29-kube-api-access-gnsbl" (OuterVolumeSpecName: "kube-api-access-gnsbl") pod "04da03ab-c17c-40e3-ab34-524cda37de29" (UID: "04da03ab-c17c-40e3-ab34-524cda37de29"). InnerVolumeSpecName "kube-api-access-gnsbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:42:58 crc kubenswrapper[4553]: I0930 19:42:58.304096 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04da03ab-c17c-40e3-ab34-524cda37de29-util" (OuterVolumeSpecName: "util") pod "04da03ab-c17c-40e3-ab34-524cda37de29" (UID: "04da03ab-c17c-40e3-ab34-524cda37de29"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:42:58 crc kubenswrapper[4553]: I0930 19:42:58.393507 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnsbl\" (UniqueName: \"kubernetes.io/projected/04da03ab-c17c-40e3-ab34-524cda37de29-kube-api-access-gnsbl\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:58 crc kubenswrapper[4553]: I0930 19:42:58.393557 4553 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04da03ab-c17c-40e3-ab34-524cda37de29-util\") on node \"crc\" DevicePath \"\"" Sep 30 19:42:58 crc kubenswrapper[4553]: I0930 19:42:58.809857 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" event={"ID":"04da03ab-c17c-40e3-ab34-524cda37de29","Type":"ContainerDied","Data":"d60a231d4b4113b7f19639fc736675d470fa4d40f9872bc919fd8d3b5354043d"} Sep 30 19:42:58 crc kubenswrapper[4553]: I0930 19:42:58.809918 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d60a231d4b4113b7f19639fc736675d470fa4d40f9872bc919fd8d3b5354043d" Sep 30 19:42:58 crc kubenswrapper[4553]: I0930 19:42:58.809947 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.320989 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-xxfdp"] Sep 30 19:43:00 crc kubenswrapper[4553]: E0930 19:43:00.321398 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04da03ab-c17c-40e3-ab34-524cda37de29" containerName="pull" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.321411 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="04da03ab-c17c-40e3-ab34-524cda37de29" containerName="pull" Sep 30 19:43:00 crc kubenswrapper[4553]: E0930 19:43:00.321423 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04da03ab-c17c-40e3-ab34-524cda37de29" containerName="util" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.321429 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="04da03ab-c17c-40e3-ab34-524cda37de29" containerName="util" Sep 30 19:43:00 crc kubenswrapper[4553]: E0930 19:43:00.321447 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04da03ab-c17c-40e3-ab34-524cda37de29" containerName="extract" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.321454 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="04da03ab-c17c-40e3-ab34-524cda37de29" containerName="extract" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.321539 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="04da03ab-c17c-40e3-ab34-524cda37de29" containerName="extract" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.321985 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-xxfdp" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.324155 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-dmv2q" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.324719 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.325236 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.344427 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-xxfdp"] Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.481708 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc7h5\" (UniqueName: \"kubernetes.io/projected/c549e241-2eff-4e22-8d14-fb0d64873ac2-kube-api-access-qc7h5\") pod \"nmstate-operator-5d6f6cfd66-xxfdp\" (UID: \"c549e241-2eff-4e22-8d14-fb0d64873ac2\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-xxfdp" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.582458 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc7h5\" (UniqueName: \"kubernetes.io/projected/c549e241-2eff-4e22-8d14-fb0d64873ac2-kube-api-access-qc7h5\") pod \"nmstate-operator-5d6f6cfd66-xxfdp\" (UID: \"c549e241-2eff-4e22-8d14-fb0d64873ac2\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-xxfdp" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.607817 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc7h5\" (UniqueName: \"kubernetes.io/projected/c549e241-2eff-4e22-8d14-fb0d64873ac2-kube-api-access-qc7h5\") pod \"nmstate-operator-5d6f6cfd66-xxfdp\" (UID: 
\"c549e241-2eff-4e22-8d14-fb0d64873ac2\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-xxfdp" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.637844 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-xxfdp" Sep 30 19:43:00 crc kubenswrapper[4553]: I0930 19:43:00.847509 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-xxfdp"] Sep 30 19:43:00 crc kubenswrapper[4553]: W0930 19:43:00.859943 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc549e241_2eff_4e22_8d14_fb0d64873ac2.slice/crio-b2e47c3f2d0a48f025b7f72a1bb9d4feb962025dfc153cad145d3fa049ff53bf WatchSource:0}: Error finding container b2e47c3f2d0a48f025b7f72a1bb9d4feb962025dfc153cad145d3fa049ff53bf: Status 404 returned error can't find the container with id b2e47c3f2d0a48f025b7f72a1bb9d4feb962025dfc153cad145d3fa049ff53bf Sep 30 19:43:01 crc kubenswrapper[4553]: I0930 19:43:01.826488 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-xxfdp" event={"ID":"c549e241-2eff-4e22-8d14-fb0d64873ac2","Type":"ContainerStarted","Data":"b2e47c3f2d0a48f025b7f72a1bb9d4feb962025dfc153cad145d3fa049ff53bf"} Sep 30 19:43:03 crc kubenswrapper[4553]: I0930 19:43:03.838830 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-xxfdp" event={"ID":"c549e241-2eff-4e22-8d14-fb0d64873ac2","Type":"ContainerStarted","Data":"ef95283db9e24bf4a38294f3ecfd91cedf35e41c1acf78d242fcdf1b1bea0c6a"} Sep 30 19:43:03 crc kubenswrapper[4553]: I0930 19:43:03.855500 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-xxfdp" podStartSLOduration=1.431528461 podStartE2EDuration="3.855481806s" podCreationTimestamp="2025-09-30 19:43:00 +0000 UTC" 
firstStartedPulling="2025-09-30 19:43:00.861965042 +0000 UTC m=+634.061467172" lastFinishedPulling="2025-09-30 19:43:03.285918377 +0000 UTC m=+636.485420517" observedRunningTime="2025-09-30 19:43:03.854786828 +0000 UTC m=+637.054288968" watchObservedRunningTime="2025-09-30 19:43:03.855481806 +0000 UTC m=+637.054983946" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.746905 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-b2m5s"] Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.747742 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-b2m5s" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.750099 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bdvc5" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.756634 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-b2m5s"] Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.782105 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-4h579"] Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.782845 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.786729 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.802695 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-cgzk9"] Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.803342 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.823794 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-4h579"] Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.838441 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/18f2ad62-a815-4e7b-95ad-eb0f99c24665-nmstate-lock\") pod \"nmstate-handler-cgzk9\" (UID: \"18f2ad62-a815-4e7b-95ad-eb0f99c24665\") " pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.838502 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2n5l\" (UniqueName: \"kubernetes.io/projected/6ccd3ce9-0e6b-4142-84c8-ce7fba27a760-kube-api-access-h2n5l\") pod \"nmstate-webhook-6d689559c5-4h579\" (UID: \"6ccd3ce9-0e6b-4142-84c8-ce7fba27a760\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.838524 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dndqk\" (UniqueName: \"kubernetes.io/projected/c3247d9f-1127-4fa1-9124-815ab13ffadb-kube-api-access-dndqk\") pod \"nmstate-metrics-58fcddf996-b2m5s\" (UID: \"c3247d9f-1127-4fa1-9124-815ab13ffadb\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-b2m5s" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.838576 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6ccd3ce9-0e6b-4142-84c8-ce7fba27a760-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-4h579\" (UID: \"6ccd3ce9-0e6b-4142-84c8-ce7fba27a760\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 
19:43:04.838595 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/18f2ad62-a815-4e7b-95ad-eb0f99c24665-dbus-socket\") pod \"nmstate-handler-cgzk9\" (UID: \"18f2ad62-a815-4e7b-95ad-eb0f99c24665\") " pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.838616 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/18f2ad62-a815-4e7b-95ad-eb0f99c24665-ovs-socket\") pod \"nmstate-handler-cgzk9\" (UID: \"18f2ad62-a815-4e7b-95ad-eb0f99c24665\") " pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.838631 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2k4z\" (UniqueName: \"kubernetes.io/projected/18f2ad62-a815-4e7b-95ad-eb0f99c24665-kube-api-access-f2k4z\") pod \"nmstate-handler-cgzk9\" (UID: \"18f2ad62-a815-4e7b-95ad-eb0f99c24665\") " pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.920936 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w"] Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.921741 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.924476 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.924666 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-xnxsp" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.924728 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.936154 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w"] Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.939570 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/18f2ad62-a815-4e7b-95ad-eb0f99c24665-dbus-socket\") pod \"nmstate-handler-cgzk9\" (UID: \"18f2ad62-a815-4e7b-95ad-eb0f99c24665\") " pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.939607 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/18f2ad62-a815-4e7b-95ad-eb0f99c24665-ovs-socket\") pod \"nmstate-handler-cgzk9\" (UID: \"18f2ad62-a815-4e7b-95ad-eb0f99c24665\") " pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.939630 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2k4z\" (UniqueName: \"kubernetes.io/projected/18f2ad62-a815-4e7b-95ad-eb0f99c24665-kube-api-access-f2k4z\") pod \"nmstate-handler-cgzk9\" (UID: \"18f2ad62-a815-4e7b-95ad-eb0f99c24665\") " pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.939649 
4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f94ac837-cb01-45f9-a065-76734fc913c1-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-4bh2w\" (UID: \"f94ac837-cb01-45f9-a065-76734fc913c1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.939675 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/18f2ad62-a815-4e7b-95ad-eb0f99c24665-nmstate-lock\") pod \"nmstate-handler-cgzk9\" (UID: \"18f2ad62-a815-4e7b-95ad-eb0f99c24665\") " pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.939702 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2n5l\" (UniqueName: \"kubernetes.io/projected/6ccd3ce9-0e6b-4142-84c8-ce7fba27a760-kube-api-access-h2n5l\") pod \"nmstate-webhook-6d689559c5-4h579\" (UID: \"6ccd3ce9-0e6b-4142-84c8-ce7fba27a760\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.939718 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7264\" (UniqueName: \"kubernetes.io/projected/f94ac837-cb01-45f9-a065-76734fc913c1-kube-api-access-g7264\") pod \"nmstate-console-plugin-864bb6dfb5-4bh2w\" (UID: \"f94ac837-cb01-45f9-a065-76734fc913c1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.939738 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dndqk\" (UniqueName: \"kubernetes.io/projected/c3247d9f-1127-4fa1-9124-815ab13ffadb-kube-api-access-dndqk\") pod \"nmstate-metrics-58fcddf996-b2m5s\" (UID: \"c3247d9f-1127-4fa1-9124-815ab13ffadb\") " 
pod="openshift-nmstate/nmstate-metrics-58fcddf996-b2m5s" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.939797 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f94ac837-cb01-45f9-a065-76734fc913c1-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-4bh2w\" (UID: \"f94ac837-cb01-45f9-a065-76734fc913c1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.939817 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6ccd3ce9-0e6b-4142-84c8-ce7fba27a760-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-4h579\" (UID: \"6ccd3ce9-0e6b-4142-84c8-ce7fba27a760\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" Sep 30 19:43:04 crc kubenswrapper[4553]: E0930 19:43:04.939927 4553 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.939953 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/18f2ad62-a815-4e7b-95ad-eb0f99c24665-nmstate-lock\") pod \"nmstate-handler-cgzk9\" (UID: \"18f2ad62-a815-4e7b-95ad-eb0f99c24665\") " pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:04 crc kubenswrapper[4553]: E0930 19:43:04.939975 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ccd3ce9-0e6b-4142-84c8-ce7fba27a760-tls-key-pair podName:6ccd3ce9-0e6b-4142-84c8-ce7fba27a760 nodeName:}" failed. No retries permitted until 2025-09-30 19:43:05.439960174 +0000 UTC m=+638.639462304 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/6ccd3ce9-0e6b-4142-84c8-ce7fba27a760-tls-key-pair") pod "nmstate-webhook-6d689559c5-4h579" (UID: "6ccd3ce9-0e6b-4142-84c8-ce7fba27a760") : secret "openshift-nmstate-webhook" not found Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.940326 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/18f2ad62-a815-4e7b-95ad-eb0f99c24665-dbus-socket\") pod \"nmstate-handler-cgzk9\" (UID: \"18f2ad62-a815-4e7b-95ad-eb0f99c24665\") " pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.940371 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/18f2ad62-a815-4e7b-95ad-eb0f99c24665-ovs-socket\") pod \"nmstate-handler-cgzk9\" (UID: \"18f2ad62-a815-4e7b-95ad-eb0f99c24665\") " pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.962882 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dndqk\" (UniqueName: \"kubernetes.io/projected/c3247d9f-1127-4fa1-9124-815ab13ffadb-kube-api-access-dndqk\") pod \"nmstate-metrics-58fcddf996-b2m5s\" (UID: \"c3247d9f-1127-4fa1-9124-815ab13ffadb\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-b2m5s" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.962941 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2n5l\" (UniqueName: \"kubernetes.io/projected/6ccd3ce9-0e6b-4142-84c8-ce7fba27a760-kube-api-access-h2n5l\") pod \"nmstate-webhook-6d689559c5-4h579\" (UID: \"6ccd3ce9-0e6b-4142-84c8-ce7fba27a760\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" Sep 30 19:43:04 crc kubenswrapper[4553]: I0930 19:43:04.968820 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2k4z\" 
(UniqueName: \"kubernetes.io/projected/18f2ad62-a815-4e7b-95ad-eb0f99c24665-kube-api-access-f2k4z\") pod \"nmstate-handler-cgzk9\" (UID: \"18f2ad62-a815-4e7b-95ad-eb0f99c24665\") " pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.041446 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f94ac837-cb01-45f9-a065-76734fc913c1-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-4bh2w\" (UID: \"f94ac837-cb01-45f9-a065-76734fc913c1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.041536 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f94ac837-cb01-45f9-a065-76734fc913c1-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-4bh2w\" (UID: \"f94ac837-cb01-45f9-a065-76734fc913c1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.041588 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7264\" (UniqueName: \"kubernetes.io/projected/f94ac837-cb01-45f9-a065-76734fc913c1-kube-api-access-g7264\") pod \"nmstate-console-plugin-864bb6dfb5-4bh2w\" (UID: \"f94ac837-cb01-45f9-a065-76734fc913c1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" Sep 30 19:43:05 crc kubenswrapper[4553]: E0930 19:43:05.041691 4553 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Sep 30 19:43:05 crc kubenswrapper[4553]: E0930 19:43:05.041773 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f94ac837-cb01-45f9-a065-76734fc913c1-plugin-serving-cert podName:f94ac837-cb01-45f9-a065-76734fc913c1 nodeName:}" failed. 
No retries permitted until 2025-09-30 19:43:05.541751751 +0000 UTC m=+638.741253971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/f94ac837-cb01-45f9-a065-76734fc913c1-plugin-serving-cert") pod "nmstate-console-plugin-864bb6dfb5-4bh2w" (UID: "f94ac837-cb01-45f9-a065-76734fc913c1") : secret "plugin-serving-cert" not found Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.042366 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f94ac837-cb01-45f9-a065-76734fc913c1-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-4bh2w\" (UID: \"f94ac837-cb01-45f9-a065-76734fc913c1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.062059 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-b2m5s" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.068573 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7264\" (UniqueName: \"kubernetes.io/projected/f94ac837-cb01-45f9-a065-76734fc913c1-kube-api-access-g7264\") pod \"nmstate-console-plugin-864bb6dfb5-4bh2w\" (UID: \"f94ac837-cb01-45f9-a065-76734fc913c1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.129708 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.149138 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c6c76ddf7-8m48x"] Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.149969 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: W0930 19:43:05.159200 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18f2ad62_a815_4e7b_95ad_eb0f99c24665.slice/crio-900d534015f4bdc59d7244726a98b1dc2dd0a8f1653869063316435ab3f131f6 WatchSource:0}: Error finding container 900d534015f4bdc59d7244726a98b1dc2dd0a8f1653869063316435ab3f131f6: Status 404 returned error can't find the container with id 900d534015f4bdc59d7244726a98b1dc2dd0a8f1653869063316435ab3f131f6 Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.187093 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c6c76ddf7-8m48x"] Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.244301 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/122fddec-4447-4c99-b1f1-c9b6fcaad117-oauth-serving-cert\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.244348 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/122fddec-4447-4c99-b1f1-c9b6fcaad117-console-oauth-config\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.244371 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/122fddec-4447-4c99-b1f1-c9b6fcaad117-service-ca\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " 
pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.244408 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpkkb\" (UniqueName: \"kubernetes.io/projected/122fddec-4447-4c99-b1f1-c9b6fcaad117-kube-api-access-zpkkb\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.244437 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/122fddec-4447-4c99-b1f1-c9b6fcaad117-console-config\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.244478 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/122fddec-4447-4c99-b1f1-c9b6fcaad117-trusted-ca-bundle\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.244497 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/122fddec-4447-4c99-b1f1-c9b6fcaad117-console-serving-cert\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.345576 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/122fddec-4447-4c99-b1f1-c9b6fcaad117-trusted-ca-bundle\") pod \"console-c6c76ddf7-8m48x\" 
(UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.345623 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/122fddec-4447-4c99-b1f1-c9b6fcaad117-console-serving-cert\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.345644 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/122fddec-4447-4c99-b1f1-c9b6fcaad117-oauth-serving-cert\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.345666 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/122fddec-4447-4c99-b1f1-c9b6fcaad117-console-oauth-config\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.345686 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/122fddec-4447-4c99-b1f1-c9b6fcaad117-service-ca\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.345729 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpkkb\" (UniqueName: \"kubernetes.io/projected/122fddec-4447-4c99-b1f1-c9b6fcaad117-kube-api-access-zpkkb\") pod \"console-c6c76ddf7-8m48x\" (UID: 
\"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.345761 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/122fddec-4447-4c99-b1f1-c9b6fcaad117-console-config\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.346682 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/122fddec-4447-4c99-b1f1-c9b6fcaad117-oauth-serving-cert\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.346711 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/122fddec-4447-4c99-b1f1-c9b6fcaad117-service-ca\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.346903 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/122fddec-4447-4c99-b1f1-c9b6fcaad117-console-config\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.347711 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/122fddec-4447-4c99-b1f1-c9b6fcaad117-trusted-ca-bundle\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " 
pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.351675 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/122fddec-4447-4c99-b1f1-c9b6fcaad117-console-oauth-config\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.351885 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/122fddec-4447-4c99-b1f1-c9b6fcaad117-console-serving-cert\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.362186 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpkkb\" (UniqueName: \"kubernetes.io/projected/122fddec-4447-4c99-b1f1-c9b6fcaad117-kube-api-access-zpkkb\") pod \"console-c6c76ddf7-8m48x\" (UID: \"122fddec-4447-4c99-b1f1-c9b6fcaad117\") " pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.447078 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6ccd3ce9-0e6b-4142-84c8-ce7fba27a760-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-4h579\" (UID: \"6ccd3ce9-0e6b-4142-84c8-ce7fba27a760\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.450823 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6ccd3ce9-0e6b-4142-84c8-ce7fba27a760-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-4h579\" (UID: \"6ccd3ce9-0e6b-4142-84c8-ce7fba27a760\") " 
pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.468438 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.548129 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f94ac837-cb01-45f9-a065-76734fc913c1-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-4bh2w\" (UID: \"f94ac837-cb01-45f9-a065-76734fc913c1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.552656 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f94ac837-cb01-45f9-a065-76734fc913c1-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-4bh2w\" (UID: \"f94ac837-cb01-45f9-a065-76734fc913c1\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.581888 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-b2m5s"] Sep 30 19:43:05 crc kubenswrapper[4553]: W0930 19:43:05.595923 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3247d9f_1127_4fa1_9124_815ab13ffadb.slice/crio-a9f78405ecbfc95cb09737acf763e84eb44efd3ffda1a09724ba6e0473dfd46b WatchSource:0}: Error finding container a9f78405ecbfc95cb09737acf763e84eb44efd3ffda1a09724ba6e0473dfd46b: Status 404 returned error can't find the container with id a9f78405ecbfc95cb09737acf763e84eb44efd3ffda1a09724ba6e0473dfd46b Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.696512 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.839285 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.851144 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-cgzk9" event={"ID":"18f2ad62-a815-4e7b-95ad-eb0f99c24665","Type":"ContainerStarted","Data":"900d534015f4bdc59d7244726a98b1dc2dd0a8f1653869063316435ab3f131f6"} Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.852261 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-b2m5s" event={"ID":"c3247d9f-1127-4fa1-9124-815ab13ffadb","Type":"ContainerStarted","Data":"a9f78405ecbfc95cb09737acf763e84eb44efd3ffda1a09724ba6e0473dfd46b"} Sep 30 19:43:05 crc kubenswrapper[4553]: I0930 19:43:05.898296 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c6c76ddf7-8m48x"] Sep 30 19:43:06 crc kubenswrapper[4553]: I0930 19:43:06.084191 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-4h579"] Sep 30 19:43:06 crc kubenswrapper[4553]: I0930 19:43:06.208409 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w"] Sep 30 19:43:06 crc kubenswrapper[4553]: W0930 19:43:06.216263 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf94ac837_cb01_45f9_a065_76734fc913c1.slice/crio-79e5c960b16b6a8e15164c0807856d637d2d4e6d7250a72b4f931eaebd8e3eec WatchSource:0}: Error finding container 79e5c960b16b6a8e15164c0807856d637d2d4e6d7250a72b4f931eaebd8e3eec: Status 404 returned error can't find the container with id 79e5c960b16b6a8e15164c0807856d637d2d4e6d7250a72b4f931eaebd8e3eec Sep 30 
19:43:06 crc kubenswrapper[4553]: I0930 19:43:06.866499 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6c76ddf7-8m48x" event={"ID":"122fddec-4447-4c99-b1f1-c9b6fcaad117","Type":"ContainerStarted","Data":"32b1e35bf2be08c2099b990e31a35fb2508780a6620c563fba2603ccd916c7ea"} Sep 30 19:43:06 crc kubenswrapper[4553]: I0930 19:43:06.866886 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6c76ddf7-8m48x" event={"ID":"122fddec-4447-4c99-b1f1-c9b6fcaad117","Type":"ContainerStarted","Data":"d4f74da341507f32f4171b2dba3ce1d6bc91409af21a7881d1b458ebed58c9ec"} Sep 30 19:43:06 crc kubenswrapper[4553]: I0930 19:43:06.873804 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" event={"ID":"f94ac837-cb01-45f9-a065-76734fc913c1","Type":"ContainerStarted","Data":"79e5c960b16b6a8e15164c0807856d637d2d4e6d7250a72b4f931eaebd8e3eec"} Sep 30 19:43:06 crc kubenswrapper[4553]: I0930 19:43:06.877520 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" event={"ID":"6ccd3ce9-0e6b-4142-84c8-ce7fba27a760","Type":"ContainerStarted","Data":"c35bc4efd9f2621da0485e0c03cdee5b7249781ea0fcb282f3698fe83f63bdf0"} Sep 30 19:43:06 crc kubenswrapper[4553]: I0930 19:43:06.907794 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c6c76ddf7-8m48x" podStartSLOduration=1.907758352 podStartE2EDuration="1.907758352s" podCreationTimestamp="2025-09-30 19:43:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:43:06.896745541 +0000 UTC m=+640.096247691" watchObservedRunningTime="2025-09-30 19:43:06.907758352 +0000 UTC m=+640.107260492" Sep 30 19:43:08 crc kubenswrapper[4553]: I0930 19:43:08.893832 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-58fcddf996-b2m5s" event={"ID":"c3247d9f-1127-4fa1-9124-815ab13ffadb","Type":"ContainerStarted","Data":"38e3ba81a6c2e2d7715282b63f76b12d47a504202aeae8ba8244d60c386932e1"} Sep 30 19:43:08 crc kubenswrapper[4553]: I0930 19:43:08.895675 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" event={"ID":"6ccd3ce9-0e6b-4142-84c8-ce7fba27a760","Type":"ContainerStarted","Data":"d66af8878a96e0310a27c3a49a94a0049efb787609efd87385cdf9174e0250ad"} Sep 30 19:43:08 crc kubenswrapper[4553]: I0930 19:43:08.895752 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" Sep 30 19:43:08 crc kubenswrapper[4553]: I0930 19:43:08.902893 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-cgzk9" event={"ID":"18f2ad62-a815-4e7b-95ad-eb0f99c24665","Type":"ContainerStarted","Data":"b711054e1b4b0910584629523d7e61bbc4d27ae376e0078b2a96756ed2011753"} Sep 30 19:43:08 crc kubenswrapper[4553]: I0930 19:43:08.903182 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:08 crc kubenswrapper[4553]: I0930 19:43:08.915363 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" podStartSLOduration=2.733266394 podStartE2EDuration="4.915337449s" podCreationTimestamp="2025-09-30 19:43:04 +0000 UTC" firstStartedPulling="2025-09-30 19:43:06.089692597 +0000 UTC m=+639.289194727" lastFinishedPulling="2025-09-30 19:43:08.271763632 +0000 UTC m=+641.471265782" observedRunningTime="2025-09-30 19:43:08.911568879 +0000 UTC m=+642.111071019" watchObservedRunningTime="2025-09-30 19:43:08.915337449 +0000 UTC m=+642.114839579" Sep 30 19:43:10 crc kubenswrapper[4553]: I0930 19:43:10.915519 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" event={"ID":"f94ac837-cb01-45f9-a065-76734fc913c1","Type":"ContainerStarted","Data":"dd18cc33b717772b9599114a1808746544adfb577ff06973051e97f6ff3a85be"} Sep 30 19:43:10 crc kubenswrapper[4553]: I0930 19:43:10.943011 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-cgzk9" podStartSLOduration=3.8199799309999998 podStartE2EDuration="6.942994788s" podCreationTimestamp="2025-09-30 19:43:04 +0000 UTC" firstStartedPulling="2025-09-30 19:43:05.166725962 +0000 UTC m=+638.366228092" lastFinishedPulling="2025-09-30 19:43:08.289740809 +0000 UTC m=+641.489242949" observedRunningTime="2025-09-30 19:43:08.957193981 +0000 UTC m=+642.156696121" watchObservedRunningTime="2025-09-30 19:43:10.942994788 +0000 UTC m=+644.142496918" Sep 30 19:43:10 crc kubenswrapper[4553]: I0930 19:43:10.945082 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-4bh2w" podStartSLOduration=3.094420445 podStartE2EDuration="6.945075764s" podCreationTimestamp="2025-09-30 19:43:04 +0000 UTC" firstStartedPulling="2025-09-30 19:43:06.217904525 +0000 UTC m=+639.417406655" lastFinishedPulling="2025-09-30 19:43:10.068559844 +0000 UTC m=+643.268061974" observedRunningTime="2025-09-30 19:43:10.943088522 +0000 UTC m=+644.142590712" watchObservedRunningTime="2025-09-30 19:43:10.945075764 +0000 UTC m=+644.144577884" Sep 30 19:43:11 crc kubenswrapper[4553]: I0930 19:43:11.924959 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-b2m5s" event={"ID":"c3247d9f-1127-4fa1-9124-815ab13ffadb","Type":"ContainerStarted","Data":"c56195962a563d50637ce073737201c0dd3b1f4c05821b5c8eea68733193c440"} Sep 30 19:43:11 crc kubenswrapper[4553]: I0930 19:43:11.954124 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-metrics-58fcddf996-b2m5s" podStartSLOduration=2.363000542 podStartE2EDuration="7.954102506s" podCreationTimestamp="2025-09-30 19:43:04 +0000 UTC" firstStartedPulling="2025-09-30 19:43:05.600716429 +0000 UTC m=+638.800218559" lastFinishedPulling="2025-09-30 19:43:11.191818363 +0000 UTC m=+644.391320523" observedRunningTime="2025-09-30 19:43:11.948501788 +0000 UTC m=+645.148003978" watchObservedRunningTime="2025-09-30 19:43:11.954102506 +0000 UTC m=+645.153604646" Sep 30 19:43:15 crc kubenswrapper[4553]: I0930 19:43:15.166388 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-cgzk9" Sep 30 19:43:15 crc kubenswrapper[4553]: I0930 19:43:15.468826 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:15 crc kubenswrapper[4553]: I0930 19:43:15.468911 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:15 crc kubenswrapper[4553]: I0930 19:43:15.480493 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:15 crc kubenswrapper[4553]: I0930 19:43:15.958281 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c6c76ddf7-8m48x" Sep 30 19:43:16 crc kubenswrapper[4553]: I0930 19:43:16.030688 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-6csmn"] Sep 30 19:43:25 crc kubenswrapper[4553]: I0930 19:43:25.706742 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-4h579" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.093798 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-6csmn" 
podUID="7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" containerName="console" containerID="cri-o://d12d09639e821a5b5300bead7b9ff300c154e5224ee6024d82571cec8f826ab2" gracePeriod=15 Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.480636 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff"] Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.481824 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.488845 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.499701 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff"] Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.510665 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/841a6dbb-567d-429f-9096-23b69f7b9e5f-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff\" (UID: \"841a6dbb-567d-429f-9096-23b69f7b9e5f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.510723 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/841a6dbb-567d-429f-9096-23b69f7b9e5f-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff\" (UID: \"841a6dbb-567d-429f-9096-23b69f7b9e5f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 
19:43:41.510750 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh54z\" (UniqueName: \"kubernetes.io/projected/841a6dbb-567d-429f-9096-23b69f7b9e5f-kube-api-access-kh54z\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff\" (UID: \"841a6dbb-567d-429f-9096-23b69f7b9e5f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.611886 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/841a6dbb-567d-429f-9096-23b69f7b9e5f-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff\" (UID: \"841a6dbb-567d-429f-9096-23b69f7b9e5f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.611955 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/841a6dbb-567d-429f-9096-23b69f7b9e5f-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff\" (UID: \"841a6dbb-567d-429f-9096-23b69f7b9e5f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.611985 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh54z\" (UniqueName: \"kubernetes.io/projected/841a6dbb-567d-429f-9096-23b69f7b9e5f-kube-api-access-kh54z\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff\" (UID: \"841a6dbb-567d-429f-9096-23b69f7b9e5f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.612600 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/841a6dbb-567d-429f-9096-23b69f7b9e5f-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff\" (UID: \"841a6dbb-567d-429f-9096-23b69f7b9e5f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.612649 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/841a6dbb-567d-429f-9096-23b69f7b9e5f-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff\" (UID: \"841a6dbb-567d-429f-9096-23b69f7b9e5f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.642926 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh54z\" (UniqueName: \"kubernetes.io/projected/841a6dbb-567d-429f-9096-23b69f7b9e5f-kube-api-access-kh54z\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff\" (UID: \"841a6dbb-567d-429f-9096-23b69f7b9e5f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.687964 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-6csmn_7cbc3e79-bfd5-4b89-9e32-bd92d2700f74/console/0.log" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.688024 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.713803 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gbgc\" (UniqueName: \"kubernetes.io/projected/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-kube-api-access-6gbgc\") pod \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.713884 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-serving-cert\") pod \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.713910 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-oauth-serving-cert\") pod \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.713931 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-trusted-ca-bundle\") pod \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.713962 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-oauth-config\") pod \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.714001 4553 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-config\") pod \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.714056 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-service-ca\") pod \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\" (UID: \"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74\") " Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.715108 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-service-ca" (OuterVolumeSpecName: "service-ca") pod "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" (UID: "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.715216 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" (UID: "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.715239 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" (UID: "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.718413 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" (UID: "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.732172 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-config" (OuterVolumeSpecName: "console-config") pod "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" (UID: "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.735171 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" (UID: "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.737501 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-kube-api-access-6gbgc" (OuterVolumeSpecName: "kube-api-access-6gbgc") pod "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" (UID: "7cbc3e79-bfd5-4b89-9e32-bd92d2700f74"). InnerVolumeSpecName "kube-api-access-6gbgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.815375 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.815427 4553 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.815450 4553 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.815460 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gbgc\" (UniqueName: \"kubernetes.io/projected/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-kube-api-access-6gbgc\") on node \"crc\" DevicePath \"\"" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.815470 4553 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.815479 4553 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.815487 4553 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:43:41 crc kubenswrapper[4553]: I0930 19:43:41.815494 4553 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74-console-oauth-config\") 
on node \"crc\" DevicePath \"\"" Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.045899 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff"] Sep 30 19:43:42 crc kubenswrapper[4553]: W0930 19:43:42.056066 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod841a6dbb_567d_429f_9096_23b69f7b9e5f.slice/crio-fee6037173ff4dc87a21206b2184027574e03085070b30061864989e8c227875 WatchSource:0}: Error finding container fee6037173ff4dc87a21206b2184027574e03085070b30061864989e8c227875: Status 404 returned error can't find the container with id fee6037173ff4dc87a21206b2184027574e03085070b30061864989e8c227875 Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.141061 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" event={"ID":"841a6dbb-567d-429f-9096-23b69f7b9e5f","Type":"ContainerStarted","Data":"fee6037173ff4dc87a21206b2184027574e03085070b30061864989e8c227875"} Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.142672 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-6csmn_7cbc3e79-bfd5-4b89-9e32-bd92d2700f74/console/0.log" Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.142714 4553 generic.go:334] "Generic (PLEG): container finished" podID="7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" containerID="d12d09639e821a5b5300bead7b9ff300c154e5224ee6024d82571cec8f826ab2" exitCode=2 Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.142737 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6csmn" event={"ID":"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74","Type":"ContainerDied","Data":"d12d09639e821a5b5300bead7b9ff300c154e5224ee6024d82571cec8f826ab2"} Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.142756 
4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6csmn" event={"ID":"7cbc3e79-bfd5-4b89-9e32-bd92d2700f74","Type":"ContainerDied","Data":"2d851c626bb46043b553592433c219a24a04a7d07d41202b56bda105435b794d"} Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.142775 4553 scope.go:117] "RemoveContainer" containerID="d12d09639e821a5b5300bead7b9ff300c154e5224ee6024d82571cec8f826ab2" Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.142894 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6csmn" Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.158178 4553 scope.go:117] "RemoveContainer" containerID="d12d09639e821a5b5300bead7b9ff300c154e5224ee6024d82571cec8f826ab2" Sep 30 19:43:42 crc kubenswrapper[4553]: E0930 19:43:42.158609 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12d09639e821a5b5300bead7b9ff300c154e5224ee6024d82571cec8f826ab2\": container with ID starting with d12d09639e821a5b5300bead7b9ff300c154e5224ee6024d82571cec8f826ab2 not found: ID does not exist" containerID="d12d09639e821a5b5300bead7b9ff300c154e5224ee6024d82571cec8f826ab2" Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.158645 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d12d09639e821a5b5300bead7b9ff300c154e5224ee6024d82571cec8f826ab2"} err="failed to get container status \"d12d09639e821a5b5300bead7b9ff300c154e5224ee6024d82571cec8f826ab2\": rpc error: code = NotFound desc = could not find container \"d12d09639e821a5b5300bead7b9ff300c154e5224ee6024d82571cec8f826ab2\": container with ID starting with d12d09639e821a5b5300bead7b9ff300c154e5224ee6024d82571cec8f826ab2 not found: ID does not exist" Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.176355 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-f9d7485db-6csmn"] Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.182488 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-6csmn"] Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.648866 4553 patch_prober.go:28] interesting pod/console-f9d7485db-6csmn container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 19:43:42 crc kubenswrapper[4553]: I0930 19:43:42.648947 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-6csmn" podUID="7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 19:43:43 crc kubenswrapper[4553]: I0930 19:43:43.159775 4553 generic.go:334] "Generic (PLEG): container finished" podID="841a6dbb-567d-429f-9096-23b69f7b9e5f" containerID="a5c364d74d9aed766f601664da8c015e2fedb92ec96418e05d89268885a1f95e" exitCode=0 Sep 30 19:43:43 crc kubenswrapper[4553]: I0930 19:43:43.160109 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" event={"ID":"841a6dbb-567d-429f-9096-23b69f7b9e5f","Type":"ContainerDied","Data":"a5c364d74d9aed766f601664da8c015e2fedb92ec96418e05d89268885a1f95e"} Sep 30 19:43:43 crc kubenswrapper[4553]: I0930 19:43:43.536623 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" path="/var/lib/kubelet/pods/7cbc3e79-bfd5-4b89-9e32-bd92d2700f74/volumes" Sep 30 19:43:45 crc kubenswrapper[4553]: I0930 19:43:45.178779 4553 generic.go:334] "Generic (PLEG): container finished" 
podID="841a6dbb-567d-429f-9096-23b69f7b9e5f" containerID="99e3bd94f9d86a12fea6dc20c75426078d7b332ae7ff46d56015e23420494ea5" exitCode=0 Sep 30 19:43:45 crc kubenswrapper[4553]: I0930 19:43:45.178918 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" event={"ID":"841a6dbb-567d-429f-9096-23b69f7b9e5f","Type":"ContainerDied","Data":"99e3bd94f9d86a12fea6dc20c75426078d7b332ae7ff46d56015e23420494ea5"} Sep 30 19:43:46 crc kubenswrapper[4553]: I0930 19:43:46.187238 4553 generic.go:334] "Generic (PLEG): container finished" podID="841a6dbb-567d-429f-9096-23b69f7b9e5f" containerID="5f7c96ad69ce85be6062fc762e671075277c76e01e78b996c32fb75856c0acb8" exitCode=0 Sep 30 19:43:46 crc kubenswrapper[4553]: I0930 19:43:46.187326 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" event={"ID":"841a6dbb-567d-429f-9096-23b69f7b9e5f","Type":"ContainerDied","Data":"5f7c96ad69ce85be6062fc762e671075277c76e01e78b996c32fb75856c0acb8"} Sep 30 19:43:47 crc kubenswrapper[4553]: I0930 19:43:47.514504 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" Sep 30 19:43:47 crc kubenswrapper[4553]: I0930 19:43:47.646482 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh54z\" (UniqueName: \"kubernetes.io/projected/841a6dbb-567d-429f-9096-23b69f7b9e5f-kube-api-access-kh54z\") pod \"841a6dbb-567d-429f-9096-23b69f7b9e5f\" (UID: \"841a6dbb-567d-429f-9096-23b69f7b9e5f\") " Sep 30 19:43:47 crc kubenswrapper[4553]: I0930 19:43:47.646563 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/841a6dbb-567d-429f-9096-23b69f7b9e5f-bundle\") pod \"841a6dbb-567d-429f-9096-23b69f7b9e5f\" (UID: \"841a6dbb-567d-429f-9096-23b69f7b9e5f\") " Sep 30 19:43:47 crc kubenswrapper[4553]: I0930 19:43:47.646631 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/841a6dbb-567d-429f-9096-23b69f7b9e5f-util\") pod \"841a6dbb-567d-429f-9096-23b69f7b9e5f\" (UID: \"841a6dbb-567d-429f-9096-23b69f7b9e5f\") " Sep 30 19:43:47 crc kubenswrapper[4553]: I0930 19:43:47.649241 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/841a6dbb-567d-429f-9096-23b69f7b9e5f-bundle" (OuterVolumeSpecName: "bundle") pod "841a6dbb-567d-429f-9096-23b69f7b9e5f" (UID: "841a6dbb-567d-429f-9096-23b69f7b9e5f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:43:47 crc kubenswrapper[4553]: I0930 19:43:47.656751 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/841a6dbb-567d-429f-9096-23b69f7b9e5f-kube-api-access-kh54z" (OuterVolumeSpecName: "kube-api-access-kh54z") pod "841a6dbb-567d-429f-9096-23b69f7b9e5f" (UID: "841a6dbb-567d-429f-9096-23b69f7b9e5f"). InnerVolumeSpecName "kube-api-access-kh54z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:43:47 crc kubenswrapper[4553]: I0930 19:43:47.669659 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/841a6dbb-567d-429f-9096-23b69f7b9e5f-util" (OuterVolumeSpecName: "util") pod "841a6dbb-567d-429f-9096-23b69f7b9e5f" (UID: "841a6dbb-567d-429f-9096-23b69f7b9e5f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:43:47 crc kubenswrapper[4553]: I0930 19:43:47.749209 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh54z\" (UniqueName: \"kubernetes.io/projected/841a6dbb-567d-429f-9096-23b69f7b9e5f-kube-api-access-kh54z\") on node \"crc\" DevicePath \"\"" Sep 30 19:43:47 crc kubenswrapper[4553]: I0930 19:43:47.749252 4553 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/841a6dbb-567d-429f-9096-23b69f7b9e5f-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:43:47 crc kubenswrapper[4553]: I0930 19:43:47.749269 4553 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/841a6dbb-567d-429f-9096-23b69f7b9e5f-util\") on node \"crc\" DevicePath \"\"" Sep 30 19:43:48 crc kubenswrapper[4553]: I0930 19:43:48.205769 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" event={"ID":"841a6dbb-567d-429f-9096-23b69f7b9e5f","Type":"ContainerDied","Data":"fee6037173ff4dc87a21206b2184027574e03085070b30061864989e8c227875"} Sep 30 19:43:48 crc kubenswrapper[4553]: I0930 19:43:48.206280 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fee6037173ff4dc87a21206b2184027574e03085070b30061864989e8c227875" Sep 30 19:43:48 crc kubenswrapper[4553]: I0930 19:43:48.205834 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.911967 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6"] Sep 30 19:43:56 crc kubenswrapper[4553]: E0930 19:43:56.912662 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841a6dbb-567d-429f-9096-23b69f7b9e5f" containerName="util" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.912674 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="841a6dbb-567d-429f-9096-23b69f7b9e5f" containerName="util" Sep 30 19:43:56 crc kubenswrapper[4553]: E0930 19:43:56.912684 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841a6dbb-567d-429f-9096-23b69f7b9e5f" containerName="extract" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.912689 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="841a6dbb-567d-429f-9096-23b69f7b9e5f" containerName="extract" Sep 30 19:43:56 crc kubenswrapper[4553]: E0930 19:43:56.912696 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" containerName="console" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.912702 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" containerName="console" Sep 30 19:43:56 crc kubenswrapper[4553]: E0930 19:43:56.912721 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841a6dbb-567d-429f-9096-23b69f7b9e5f" containerName="pull" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.912727 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="841a6dbb-567d-429f-9096-23b69f7b9e5f" containerName="pull" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.912819 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cbc3e79-bfd5-4b89-9e32-bd92d2700f74" 
containerName="console" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.912831 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="841a6dbb-567d-429f-9096-23b69f7b9e5f" containerName="extract" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.913214 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.923958 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.925131 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.925709 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.925750 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.924023 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fh895" Sep 30 19:43:56 crc kubenswrapper[4553]: I0930 19:43:56.979017 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6"] Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.065996 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2e7636d-a087-4849-9440-0095096c8022-webhook-cert\") pod \"metallb-operator-controller-manager-6dc6c6544f-hp9t6\" (UID: \"b2e7636d-a087-4849-9440-0095096c8022\") " pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" Sep 30 
19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.066041 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2e7636d-a087-4849-9440-0095096c8022-apiservice-cert\") pod \"metallb-operator-controller-manager-6dc6c6544f-hp9t6\" (UID: \"b2e7636d-a087-4849-9440-0095096c8022\") " pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.066074 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgphm\" (UniqueName: \"kubernetes.io/projected/b2e7636d-a087-4849-9440-0095096c8022-kube-api-access-zgphm\") pod \"metallb-operator-controller-manager-6dc6c6544f-hp9t6\" (UID: \"b2e7636d-a087-4849-9440-0095096c8022\") " pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.167194 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2e7636d-a087-4849-9440-0095096c8022-webhook-cert\") pod \"metallb-operator-controller-manager-6dc6c6544f-hp9t6\" (UID: \"b2e7636d-a087-4849-9440-0095096c8022\") " pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.167238 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2e7636d-a087-4849-9440-0095096c8022-apiservice-cert\") pod \"metallb-operator-controller-manager-6dc6c6544f-hp9t6\" (UID: \"b2e7636d-a087-4849-9440-0095096c8022\") " pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.167256 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgphm\" (UniqueName: 
\"kubernetes.io/projected/b2e7636d-a087-4849-9440-0095096c8022-kube-api-access-zgphm\") pod \"metallb-operator-controller-manager-6dc6c6544f-hp9t6\" (UID: \"b2e7636d-a087-4849-9440-0095096c8022\") " pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.167968 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh"] Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.168643 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.174681 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.174714 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xt48n" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.174974 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.178932 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b2e7636d-a087-4849-9440-0095096c8022-apiservice-cert\") pod \"metallb-operator-controller-manager-6dc6c6544f-hp9t6\" (UID: \"b2e7636d-a087-4849-9440-0095096c8022\") " pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.187200 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgphm\" (UniqueName: \"kubernetes.io/projected/b2e7636d-a087-4849-9440-0095096c8022-kube-api-access-zgphm\") pod \"metallb-operator-controller-manager-6dc6c6544f-hp9t6\" (UID: 
\"b2e7636d-a087-4849-9440-0095096c8022\") " pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.188652 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b2e7636d-a087-4849-9440-0095096c8022-webhook-cert\") pod \"metallb-operator-controller-manager-6dc6c6544f-hp9t6\" (UID: \"b2e7636d-a087-4849-9440-0095096c8022\") " pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.191050 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh"] Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.264490 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.369553 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz7d9\" (UniqueName: \"kubernetes.io/projected/c6c9439c-02e4-4e1d-8eca-27dfc7b0b127-kube-api-access-xz7d9\") pod \"metallb-operator-webhook-server-845c9f75c7-nj9qh\" (UID: \"c6c9439c-02e4-4e1d-8eca-27dfc7b0b127\") " pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.369862 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c6c9439c-02e4-4e1d-8eca-27dfc7b0b127-apiservice-cert\") pod \"metallb-operator-webhook-server-845c9f75c7-nj9qh\" (UID: \"c6c9439c-02e4-4e1d-8eca-27dfc7b0b127\") " pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.370007 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c6c9439c-02e4-4e1d-8eca-27dfc7b0b127-webhook-cert\") pod \"metallb-operator-webhook-server-845c9f75c7-nj9qh\" (UID: \"c6c9439c-02e4-4e1d-8eca-27dfc7b0b127\") " pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.470700 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c6c9439c-02e4-4e1d-8eca-27dfc7b0b127-webhook-cert\") pod \"metallb-operator-webhook-server-845c9f75c7-nj9qh\" (UID: \"c6c9439c-02e4-4e1d-8eca-27dfc7b0b127\") " pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.471015 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz7d9\" (UniqueName: \"kubernetes.io/projected/c6c9439c-02e4-4e1d-8eca-27dfc7b0b127-kube-api-access-xz7d9\") pod \"metallb-operator-webhook-server-845c9f75c7-nj9qh\" (UID: \"c6c9439c-02e4-4e1d-8eca-27dfc7b0b127\") " pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.471135 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c6c9439c-02e4-4e1d-8eca-27dfc7b0b127-apiservice-cert\") pod \"metallb-operator-webhook-server-845c9f75c7-nj9qh\" (UID: \"c6c9439c-02e4-4e1d-8eca-27dfc7b0b127\") " pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.475725 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c6c9439c-02e4-4e1d-8eca-27dfc7b0b127-apiservice-cert\") pod \"metallb-operator-webhook-server-845c9f75c7-nj9qh\" (UID: 
\"c6c9439c-02e4-4e1d-8eca-27dfc7b0b127\") " pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.484930 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c6c9439c-02e4-4e1d-8eca-27dfc7b0b127-webhook-cert\") pod \"metallb-operator-webhook-server-845c9f75c7-nj9qh\" (UID: \"c6c9439c-02e4-4e1d-8eca-27dfc7b0b127\") " pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.492449 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz7d9\" (UniqueName: \"kubernetes.io/projected/c6c9439c-02e4-4e1d-8eca-27dfc7b0b127-kube-api-access-xz7d9\") pod \"metallb-operator-webhook-server-845c9f75c7-nj9qh\" (UID: \"c6c9439c-02e4-4e1d-8eca-27dfc7b0b127\") " pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.524663 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" Sep 30 19:43:57 crc kubenswrapper[4553]: I0930 19:43:57.793399 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6"] Sep 30 19:43:58 crc kubenswrapper[4553]: I0930 19:43:58.097341 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh"] Sep 30 19:43:58 crc kubenswrapper[4553]: W0930 19:43:58.102750 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6c9439c_02e4_4e1d_8eca_27dfc7b0b127.slice/crio-d1653c7db9e991de23d41d53f709f125a4787125431dbdb4be12304a5e81d8b6 WatchSource:0}: Error finding container d1653c7db9e991de23d41d53f709f125a4787125431dbdb4be12304a5e81d8b6: Status 404 returned error can't find the container with id d1653c7db9e991de23d41d53f709f125a4787125431dbdb4be12304a5e81d8b6 Sep 30 19:43:58 crc kubenswrapper[4553]: I0930 19:43:58.259448 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" event={"ID":"b2e7636d-a087-4849-9440-0095096c8022","Type":"ContainerStarted","Data":"a9064d9c10f4a0815b34ee3f86a0b69c327b5eb24d25d787ef679714bf43056c"} Sep 30 19:43:58 crc kubenswrapper[4553]: I0930 19:43:58.261026 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" event={"ID":"c6c9439c-02e4-4e1d-8eca-27dfc7b0b127","Type":"ContainerStarted","Data":"d1653c7db9e991de23d41d53f709f125a4787125431dbdb4be12304a5e81d8b6"} Sep 30 19:44:04 crc kubenswrapper[4553]: I0930 19:44:04.306250 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" 
event={"ID":"b2e7636d-a087-4849-9440-0095096c8022","Type":"ContainerStarted","Data":"2c21e7c7a5675dafd02e3a4569d07f30a22bcfd2d1f681fda6d3100e4f40de0d"} Sep 30 19:44:04 crc kubenswrapper[4553]: I0930 19:44:04.307199 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" Sep 30 19:44:04 crc kubenswrapper[4553]: I0930 19:44:04.308956 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" event={"ID":"c6c9439c-02e4-4e1d-8eca-27dfc7b0b127","Type":"ContainerStarted","Data":"172bfd5a1058e6fe23c355a756d02dea4de25b90bce38d52b526be3ad6505be6"} Sep 30 19:44:04 crc kubenswrapper[4553]: I0930 19:44:04.309324 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" Sep 30 19:44:04 crc kubenswrapper[4553]: I0930 19:44:04.339599 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" podStartSLOduration=2.453285928 podStartE2EDuration="8.339579309s" podCreationTimestamp="2025-09-30 19:43:56 +0000 UTC" firstStartedPulling="2025-09-30 19:43:57.807867542 +0000 UTC m=+691.007369662" lastFinishedPulling="2025-09-30 19:44:03.694160913 +0000 UTC m=+696.893663043" observedRunningTime="2025-09-30 19:44:04.336874147 +0000 UTC m=+697.536376327" watchObservedRunningTime="2025-09-30 19:44:04.339579309 +0000 UTC m=+697.539081439" Sep 30 19:44:04 crc kubenswrapper[4553]: I0930 19:44:04.358918 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" podStartSLOduration=1.756496618 podStartE2EDuration="7.358882253s" podCreationTimestamp="2025-09-30 19:43:57 +0000 UTC" firstStartedPulling="2025-09-30 19:43:58.106846729 +0000 UTC m=+691.306348859" lastFinishedPulling="2025-09-30 
19:44:03.709232364 +0000 UTC m=+696.908734494" observedRunningTime="2025-09-30 19:44:04.356975051 +0000 UTC m=+697.556477231" watchObservedRunningTime="2025-09-30 19:44:04.358882253 +0000 UTC m=+697.558384423" Sep 30 19:44:17 crc kubenswrapper[4553]: I0930 19:44:17.531013 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-845c9f75c7-nj9qh" Sep 30 19:44:29 crc kubenswrapper[4553]: I0930 19:44:29.585948 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:44:29 crc kubenswrapper[4553]: I0930 19:44:29.586841 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:44:37 crc kubenswrapper[4553]: I0930 19:44:37.267633 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6dc6c6544f-hp9t6" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.065339 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-xpxlt"] Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.068304 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.068378 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh"] Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.069148 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.073365 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.073366 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.073507 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-ftbzj" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.084089 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh"] Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.099832 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.123213 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-frr-conf\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.123251 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-reloader\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.123286 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-frr-sockets\") pod \"frr-k8s-xpxlt\" (UID: 
\"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.123315 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-metrics\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.123342 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-metrics-certs\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.123363 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-frr-startup\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.123493 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wcdm\" (UniqueName: \"kubernetes.io/projected/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-kube-api-access-5wcdm\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.123591 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzcdp\" (UniqueName: \"kubernetes.io/projected/5e06ff00-b19c-4283-baa8-738505ae723f-kube-api-access-wzcdp\") pod \"frr-k8s-webhook-server-5478bdb765-f4rnh\" (UID: \"5e06ff00-b19c-4283-baa8-738505ae723f\") " 
pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.123652 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e06ff00-b19c-4283-baa8-738505ae723f-cert\") pod \"frr-k8s-webhook-server-5478bdb765-f4rnh\" (UID: \"5e06ff00-b19c-4283-baa8-738505ae723f\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.155428 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5k2m2"] Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.156402 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5k2m2" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.163526 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.163550 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-dx6z9" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.163559 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.163632 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.164804 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-6v9n6"] Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.167309 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-6v9n6" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.168788 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.173958 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-6v9n6"] Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.224598 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-frr-conf\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.224634 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-reloader\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.224663 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-frr-sockets\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.224686 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-metrics\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.224711 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-metrics-certs\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.224729 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-frr-startup\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.224749 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wcdm\" (UniqueName: \"kubernetes.io/projected/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-kube-api-access-5wcdm\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.224772 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzcdp\" (UniqueName: \"kubernetes.io/projected/5e06ff00-b19c-4283-baa8-738505ae723f-kube-api-access-wzcdp\") pod \"frr-k8s-webhook-server-5478bdb765-f4rnh\" (UID: \"5e06ff00-b19c-4283-baa8-738505ae723f\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.224798 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e06ff00-b19c-4283-baa8-738505ae723f-cert\") pod \"frr-k8s-webhook-server-5478bdb765-f4rnh\" (UID: \"5e06ff00-b19c-4283-baa8-738505ae723f\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh"
Sep 30 19:44:38 crc kubenswrapper[4553]: E0930 19:44:38.224876 4553 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Sep 30 19:44:38 crc kubenswrapper[4553]: E0930 19:44:38.224907 4553 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Sep 30 19:44:38 crc kubenswrapper[4553]: E0930 19:44:38.224949 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-metrics-certs podName:ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8 nodeName:}" failed. No retries permitted until 2025-09-30 19:44:38.724929915 +0000 UTC m=+731.924432045 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-metrics-certs") pod "frr-k8s-xpxlt" (UID: "ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8") : secret "frr-k8s-certs-secret" not found
Sep 30 19:44:38 crc kubenswrapper[4553]: E0930 19:44:38.224965 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e06ff00-b19c-4283-baa8-738505ae723f-cert podName:5e06ff00-b19c-4283-baa8-738505ae723f nodeName:}" failed. No retries permitted until 2025-09-30 19:44:38.724959415 +0000 UTC m=+731.924461545 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5e06ff00-b19c-4283-baa8-738505ae723f-cert") pod "frr-k8s-webhook-server-5478bdb765-f4rnh" (UID: "5e06ff00-b19c-4283-baa8-738505ae723f") : secret "frr-k8s-webhook-server-cert" not found
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.225123 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-reloader\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.225221 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-frr-conf\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.225297 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-frr-sockets\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.225304 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-metrics\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.226591 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-frr-startup\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.246818 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wcdm\" (UniqueName: \"kubernetes.io/projected/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-kube-api-access-5wcdm\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.262772 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzcdp\" (UniqueName: \"kubernetes.io/projected/5e06ff00-b19c-4283-baa8-738505ae723f-kube-api-access-wzcdp\") pod \"frr-k8s-webhook-server-5478bdb765-f4rnh\" (UID: \"5e06ff00-b19c-4283-baa8-738505ae723f\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.326180 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c810c77e-e85f-4932-aac6-45dc8419540b-memberlist\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.326229 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgrbk\" (UniqueName: \"kubernetes.io/projected/c43fd261-7524-4dbc-a909-1bbc73e9f658-kube-api-access-pgrbk\") pod \"controller-5d688f5ffc-6v9n6\" (UID: \"c43fd261-7524-4dbc-a909-1bbc73e9f658\") " pod="metallb-system/controller-5d688f5ffc-6v9n6"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.326248 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c810c77e-e85f-4932-aac6-45dc8419540b-metrics-certs\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.326817 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c43fd261-7524-4dbc-a909-1bbc73e9f658-cert\") pod \"controller-5d688f5ffc-6v9n6\" (UID: \"c43fd261-7524-4dbc-a909-1bbc73e9f658\") " pod="metallb-system/controller-5d688f5ffc-6v9n6"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.326837 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c43fd261-7524-4dbc-a909-1bbc73e9f658-metrics-certs\") pod \"controller-5d688f5ffc-6v9n6\" (UID: \"c43fd261-7524-4dbc-a909-1bbc73e9f658\") " pod="metallb-system/controller-5d688f5ffc-6v9n6"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.326883 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c810c77e-e85f-4932-aac6-45dc8419540b-metallb-excludel2\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.326902 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spk5t\" (UniqueName: \"kubernetes.io/projected/c810c77e-e85f-4932-aac6-45dc8419540b-kube-api-access-spk5t\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.427711 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c810c77e-e85f-4932-aac6-45dc8419540b-memberlist\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.427767 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgrbk\" (UniqueName: \"kubernetes.io/projected/c43fd261-7524-4dbc-a909-1bbc73e9f658-kube-api-access-pgrbk\") pod \"controller-5d688f5ffc-6v9n6\" (UID: \"c43fd261-7524-4dbc-a909-1bbc73e9f658\") " pod="metallb-system/controller-5d688f5ffc-6v9n6"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.427787 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c810c77e-e85f-4932-aac6-45dc8419540b-metrics-certs\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.427812 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c43fd261-7524-4dbc-a909-1bbc73e9f658-cert\") pod \"controller-5d688f5ffc-6v9n6\" (UID: \"c43fd261-7524-4dbc-a909-1bbc73e9f658\") " pod="metallb-system/controller-5d688f5ffc-6v9n6"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.427830 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c43fd261-7524-4dbc-a909-1bbc73e9f658-metrics-certs\") pod \"controller-5d688f5ffc-6v9n6\" (UID: \"c43fd261-7524-4dbc-a909-1bbc73e9f658\") " pod="metallb-system/controller-5d688f5ffc-6v9n6"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.427870 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c810c77e-e85f-4932-aac6-45dc8419540b-metallb-excludel2\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.427889 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spk5t\" (UniqueName: \"kubernetes.io/projected/c810c77e-e85f-4932-aac6-45dc8419540b-kube-api-access-spk5t\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:38 crc kubenswrapper[4553]: E0930 19:44:38.428253 4553 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Sep 30 19:44:38 crc kubenswrapper[4553]: E0930 19:44:38.428295 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c810c77e-e85f-4932-aac6-45dc8419540b-memberlist podName:c810c77e-e85f-4932-aac6-45dc8419540b nodeName:}" failed. No retries permitted until 2025-09-30 19:44:38.928283554 +0000 UTC m=+732.127785684 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c810c77e-e85f-4932-aac6-45dc8419540b-memberlist") pod "speaker-5k2m2" (UID: "c810c77e-e85f-4932-aac6-45dc8419540b") : secret "metallb-memberlist" not found
Sep 30 19:44:38 crc kubenswrapper[4553]: E0930 19:44:38.428922 4553 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Sep 30 19:44:38 crc kubenswrapper[4553]: E0930 19:44:38.428961 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c43fd261-7524-4dbc-a909-1bbc73e9f658-metrics-certs podName:c43fd261-7524-4dbc-a909-1bbc73e9f658 nodeName:}" failed. No retries permitted until 2025-09-30 19:44:38.928950781 +0000 UTC m=+732.128452911 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c43fd261-7524-4dbc-a909-1bbc73e9f658-metrics-certs") pod "controller-5d688f5ffc-6v9n6" (UID: "c43fd261-7524-4dbc-a909-1bbc73e9f658") : secret "controller-certs-secret" not found
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.429670 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c810c77e-e85f-4932-aac6-45dc8419540b-metallb-excludel2\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.430284 4553 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.431588 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c810c77e-e85f-4932-aac6-45dc8419540b-metrics-certs\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.442391 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c43fd261-7524-4dbc-a909-1bbc73e9f658-cert\") pod \"controller-5d688f5ffc-6v9n6\" (UID: \"c43fd261-7524-4dbc-a909-1bbc73e9f658\") " pod="metallb-system/controller-5d688f5ffc-6v9n6"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.450808 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgrbk\" (UniqueName: \"kubernetes.io/projected/c43fd261-7524-4dbc-a909-1bbc73e9f658-kube-api-access-pgrbk\") pod \"controller-5d688f5ffc-6v9n6\" (UID: \"c43fd261-7524-4dbc-a909-1bbc73e9f658\") " pod="metallb-system/controller-5d688f5ffc-6v9n6"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.453547 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spk5t\" (UniqueName: \"kubernetes.io/projected/c810c77e-e85f-4932-aac6-45dc8419540b-kube-api-access-spk5t\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.731659 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e06ff00-b19c-4283-baa8-738505ae723f-cert\") pod \"frr-k8s-webhook-server-5478bdb765-f4rnh\" (UID: \"5e06ff00-b19c-4283-baa8-738505ae723f\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.731815 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-metrics-certs\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.735401 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8-metrics-certs\") pod \"frr-k8s-xpxlt\" (UID: \"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8\") " pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.737974 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e06ff00-b19c-4283-baa8-738505ae723f-cert\") pod \"frr-k8s-webhook-server-5478bdb765-f4rnh\" (UID: \"5e06ff00-b19c-4283-baa8-738505ae723f\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.934665 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c810c77e-e85f-4932-aac6-45dc8419540b-memberlist\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.934839 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c43fd261-7524-4dbc-a909-1bbc73e9f658-metrics-certs\") pod \"controller-5d688f5ffc-6v9n6\" (UID: \"c43fd261-7524-4dbc-a909-1bbc73e9f658\") " pod="metallb-system/controller-5d688f5ffc-6v9n6"
Sep 30 19:44:38 crc kubenswrapper[4553]: E0930 19:44:38.934851 4553 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Sep 30 19:44:38 crc kubenswrapper[4553]: E0930 19:44:38.934932 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c810c77e-e85f-4932-aac6-45dc8419540b-memberlist podName:c810c77e-e85f-4932-aac6-45dc8419540b nodeName:}" failed. No retries permitted until 2025-09-30 19:44:39.934913479 +0000 UTC m=+733.134415619 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c810c77e-e85f-4932-aac6-45dc8419540b-memberlist") pod "speaker-5k2m2" (UID: "c810c77e-e85f-4932-aac6-45dc8419540b") : secret "metallb-memberlist" not found
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.940794 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c43fd261-7524-4dbc-a909-1bbc73e9f658-metrics-certs\") pod \"controller-5d688f5ffc-6v9n6\" (UID: \"c43fd261-7524-4dbc-a909-1bbc73e9f658\") " pod="metallb-system/controller-5d688f5ffc-6v9n6"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.984695 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:38 crc kubenswrapper[4553]: I0930 19:44:38.996845 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh"
Sep 30 19:44:39 crc kubenswrapper[4553]: I0930 19:44:39.079153 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-6v9n6"
Sep 30 19:44:39 crc kubenswrapper[4553]: I0930 19:44:39.335565 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-6v9n6"]
Sep 30 19:44:39 crc kubenswrapper[4553]: W0930 19:44:39.353328 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc43fd261_7524_4dbc_a909_1bbc73e9f658.slice/crio-d999d363b7bf636bd8b52389163861e08bec925cdd129088348f952fef7ce0bf WatchSource:0}: Error finding container d999d363b7bf636bd8b52389163861e08bec925cdd129088348f952fef7ce0bf: Status 404 returned error can't find the container with id d999d363b7bf636bd8b52389163861e08bec925cdd129088348f952fef7ce0bf
Sep 30 19:44:39 crc kubenswrapper[4553]: I0930 19:44:39.491933 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh"]
Sep 30 19:44:39 crc kubenswrapper[4553]: I0930 19:44:39.560530 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxlt" event={"ID":"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8","Type":"ContainerStarted","Data":"a65f095d0da44d4d1ee196e2e65031f91de8b2ae4da115b41be7cf77c4a37e01"}
Sep 30 19:44:39 crc kubenswrapper[4553]: I0930 19:44:39.562991 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-6v9n6" event={"ID":"c43fd261-7524-4dbc-a909-1bbc73e9f658","Type":"ContainerStarted","Data":"5cb759984fdbc70ffc0ad5f0cdd0ac66a33962400ec191230a7439a61eb73cf4"}
Sep 30 19:44:39 crc kubenswrapper[4553]: I0930 19:44:39.563014 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-6v9n6" event={"ID":"c43fd261-7524-4dbc-a909-1bbc73e9f658","Type":"ContainerStarted","Data":"d999d363b7bf636bd8b52389163861e08bec925cdd129088348f952fef7ce0bf"}
Sep 30 19:44:39 crc kubenswrapper[4553]: I0930 19:44:39.564063 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh" event={"ID":"5e06ff00-b19c-4283-baa8-738505ae723f","Type":"ContainerStarted","Data":"a8ee7aa4652d28732ae1b17df53756d611702d0db7a1be3a47991b63d640bbf1"}
Sep 30 19:44:39 crc kubenswrapper[4553]: I0930 19:44:39.948589 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c810c77e-e85f-4932-aac6-45dc8419540b-memberlist\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:39 crc kubenswrapper[4553]: I0930 19:44:39.953767 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c810c77e-e85f-4932-aac6-45dc8419540b-memberlist\") pod \"speaker-5k2m2\" (UID: \"c810c77e-e85f-4932-aac6-45dc8419540b\") " pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:39 crc kubenswrapper[4553]: I0930 19:44:39.967611 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:39 crc kubenswrapper[4553]: W0930 19:44:39.982735 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc810c77e_e85f_4932_aac6_45dc8419540b.slice/crio-c1371e02401c3b18996a280c42d940a0bfc4d262a7eebac589a095b394fd8874 WatchSource:0}: Error finding container c1371e02401c3b18996a280c42d940a0bfc4d262a7eebac589a095b394fd8874: Status 404 returned error can't find the container with id c1371e02401c3b18996a280c42d940a0bfc4d262a7eebac589a095b394fd8874
Sep 30 19:44:40 crc kubenswrapper[4553]: I0930 19:44:40.581673 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-6v9n6" event={"ID":"c43fd261-7524-4dbc-a909-1bbc73e9f658","Type":"ContainerStarted","Data":"f1c4c0028ce6b34c13e96edf808e6379da5a7c626ce9f95311ba46d166a96b1e"}
Sep 30 19:44:40 crc kubenswrapper[4553]: I0930 19:44:40.581809 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-6v9n6"
Sep 30 19:44:40 crc kubenswrapper[4553]: I0930 19:44:40.586492 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5k2m2" event={"ID":"c810c77e-e85f-4932-aac6-45dc8419540b","Type":"ContainerStarted","Data":"d7dfac5724eb10d558221ff187c1176edd7db178137af76891d532637a213fa8"}
Sep 30 19:44:40 crc kubenswrapper[4553]: I0930 19:44:40.586539 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5k2m2" event={"ID":"c810c77e-e85f-4932-aac6-45dc8419540b","Type":"ContainerStarted","Data":"c0a54f0f4eda5f5784bc6de238f0ec8882bebf9ed115d1fc2fe3d445ffda610c"}
Sep 30 19:44:40 crc kubenswrapper[4553]: I0930 19:44:40.586554 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5k2m2" event={"ID":"c810c77e-e85f-4932-aac6-45dc8419540b","Type":"ContainerStarted","Data":"c1371e02401c3b18996a280c42d940a0bfc4d262a7eebac589a095b394fd8874"}
Sep 30 19:44:40 crc kubenswrapper[4553]: I0930 19:44:40.587112 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-5k2m2"
Sep 30 19:44:40 crc kubenswrapper[4553]: I0930 19:44:40.618338 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-6v9n6" podStartSLOduration=2.618321598 podStartE2EDuration="2.618321598s" podCreationTimestamp="2025-09-30 19:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:44:40.615084482 +0000 UTC m=+733.814586612" watchObservedRunningTime="2025-09-30 19:44:40.618321598 +0000 UTC m=+733.817823728"
Sep 30 19:44:40 crc kubenswrapper[4553]: I0930 19:44:40.647228 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5k2m2" podStartSLOduration=2.647214237 podStartE2EDuration="2.647214237s" podCreationTimestamp="2025-09-30 19:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:44:40.642051252 +0000 UTC m=+733.841553382" watchObservedRunningTime="2025-09-30 19:44:40.647214237 +0000 UTC m=+733.846716367"
Sep 30 19:44:47 crc kubenswrapper[4553]: I0930 19:44:47.632327 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh" event={"ID":"5e06ff00-b19c-4283-baa8-738505ae723f","Type":"ContainerStarted","Data":"5589dc1a073be0fd5765b460d4a9374adcacfa58fdcfc5f138041d99dbae3884"}
Sep 30 19:44:47 crc kubenswrapper[4553]: I0930 19:44:47.633464 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh"
Sep 30 19:44:47 crc kubenswrapper[4553]: I0930 19:44:47.666782 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh" podStartSLOduration=1.855889333 podStartE2EDuration="9.666763981s" podCreationTimestamp="2025-09-30 19:44:38 +0000 UTC" firstStartedPulling="2025-09-30 19:44:39.513027865 +0000 UTC m=+732.712529995" lastFinishedPulling="2025-09-30 19:44:47.323902503 +0000 UTC m=+740.523404643" observedRunningTime="2025-09-30 19:44:47.666135674 +0000 UTC m=+740.865637804" watchObservedRunningTime="2025-09-30 19:44:47.666763981 +0000 UTC m=+740.866266111"
Sep 30 19:44:48 crc kubenswrapper[4553]: I0930 19:44:48.640645 4553 generic.go:334] "Generic (PLEG): container finished" podID="ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8" containerID="f24bebb6158440b400cd38e76472c24a74c916cdb51ba954fb71f2f18cbeeaa9" exitCode=0
Sep 30 19:44:48 crc kubenswrapper[4553]: I0930 19:44:48.640747 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxlt" event={"ID":"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8","Type":"ContainerDied","Data":"f24bebb6158440b400cd38e76472c24a74c916cdb51ba954fb71f2f18cbeeaa9"}
Sep 30 19:44:49 crc kubenswrapper[4553]: I0930 19:44:49.084484 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-6v9n6"
Sep 30 19:44:49 crc kubenswrapper[4553]: I0930 19:44:49.647604 4553 generic.go:334] "Generic (PLEG): container finished" podID="ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8" containerID="0c6e5df4b343f4c9e55ad737e91293797bf2a5f0525f85b19ba369a73ec17d08" exitCode=0
Sep 30 19:44:49 crc kubenswrapper[4553]: I0930 19:44:49.647748 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxlt" event={"ID":"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8","Type":"ContainerDied","Data":"0c6e5df4b343f4c9e55ad737e91293797bf2a5f0525f85b19ba369a73ec17d08"}
Sep 30 19:44:50 crc kubenswrapper[4553]: I0930 19:44:50.656486 4553 generic.go:334] "Generic (PLEG): container finished" podID="ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8" containerID="4a108b02b49de4329afd76553754d74251f2ccb34b1e79c5aeda0c2d2dae2b42" exitCode=0
Sep 30 19:44:50 crc kubenswrapper[4553]: I0930 19:44:50.656545 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxlt" event={"ID":"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8","Type":"ContainerDied","Data":"4a108b02b49de4329afd76553754d74251f2ccb34b1e79c5aeda0c2d2dae2b42"}
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.426277 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5dq4n"]
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.426853 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" podUID="96bf498c-034c-431c-ae07-4099724a48a7" containerName="controller-manager" containerID="cri-o://729718dc8f3535ff2e09a156c69f54432338cc37d1105fae6534c61833203332" gracePeriod=30
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.547233 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m"]
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.672904 4553 generic.go:334] "Generic (PLEG): container finished" podID="96bf498c-034c-431c-ae07-4099724a48a7" containerID="729718dc8f3535ff2e09a156c69f54432338cc37d1105fae6534c61833203332" exitCode=0
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.672979 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" event={"ID":"96bf498c-034c-431c-ae07-4099724a48a7","Type":"ContainerDied","Data":"729718dc8f3535ff2e09a156c69f54432338cc37d1105fae6534c61833203332"}
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.706758 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxlt" event={"ID":"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8","Type":"ContainerStarted","Data":"ab4c5485a1adfc209e74514c97df6231691c3f73a2625327b01dc1ccce6c8487"}
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.707122 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxlt" event={"ID":"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8","Type":"ContainerStarted","Data":"e58a3a01c76ec96e7d26b653f0029d25ef35e2bd76bd9c97d857cd49fcf7a239"}
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.707140 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxlt" event={"ID":"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8","Type":"ContainerStarted","Data":"239c85702a6080128caf3237492911c99fdf5a632aae259245234c4931d50bb5"}
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.707150 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxlt" event={"ID":"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8","Type":"ContainerStarted","Data":"1dc77a45cf521a5df4243aa4105490cab9b8206f37655ee1cba1b877c06b48f7"}
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.707161 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxlt" event={"ID":"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8","Type":"ContainerStarted","Data":"c726fb0a4912dd18e2fe9812396bd61c9078d79f85b740635adee2239827fc47"}
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.706907 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" podUID="0e392aad-9ae5-4942-a078-8ef9cbaffb90" containerName="route-controller-manager" containerID="cri-o://626ec327a7a45bd0959c33a538624e89ff3647a9d6159d0b2e9fb933345564b0" gracePeriod=30
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.739478 4553 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5dq4n container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.739541 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" podUID="96bf498c-034c-431c-ae07-4099724a48a7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.803827 4553 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6tq2m container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Sep 30 19:44:51 crc kubenswrapper[4553]: I0930 19:44:51.803884 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" podUID="0e392aad-9ae5-4942-a078-8ef9cbaffb90" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.030466 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n"
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.090546 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m"
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.138440 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96bf498c-034c-431c-ae07-4099724a48a7-serving-cert\") pod \"96bf498c-034c-431c-ae07-4099724a48a7\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") "
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.138514 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-client-ca\") pod \"96bf498c-034c-431c-ae07-4099724a48a7\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") "
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.138544 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-proxy-ca-bundles\") pod \"96bf498c-034c-431c-ae07-4099724a48a7\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") "
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.138602 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqw8r\" (UniqueName: \"kubernetes.io/projected/96bf498c-034c-431c-ae07-4099724a48a7-kube-api-access-vqw8r\") pod \"96bf498c-034c-431c-ae07-4099724a48a7\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") "
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.138637 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-config\") pod \"96bf498c-034c-431c-ae07-4099724a48a7\" (UID: \"96bf498c-034c-431c-ae07-4099724a48a7\") "
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.139571 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-config" (OuterVolumeSpecName: "config") pod "96bf498c-034c-431c-ae07-4099724a48a7" (UID: "96bf498c-034c-431c-ae07-4099724a48a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.139640 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "96bf498c-034c-431c-ae07-4099724a48a7" (UID: "96bf498c-034c-431c-ae07-4099724a48a7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.139873 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-client-ca" (OuterVolumeSpecName: "client-ca") pod "96bf498c-034c-431c-ae07-4099724a48a7" (UID: "96bf498c-034c-431c-ae07-4099724a48a7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.148567 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96bf498c-034c-431c-ae07-4099724a48a7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "96bf498c-034c-431c-ae07-4099724a48a7" (UID: "96bf498c-034c-431c-ae07-4099724a48a7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.148823 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96bf498c-034c-431c-ae07-4099724a48a7-kube-api-access-vqw8r" (OuterVolumeSpecName: "kube-api-access-vqw8r") pod "96bf498c-034c-431c-ae07-4099724a48a7" (UID: "96bf498c-034c-431c-ae07-4099724a48a7"). InnerVolumeSpecName "kube-api-access-vqw8r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.239541 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e392aad-9ae5-4942-a078-8ef9cbaffb90-config\") pod \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") "
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.239965 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e392aad-9ae5-4942-a078-8ef9cbaffb90-client-ca\") pod \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") "
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.240056 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbw75\" (UniqueName: \"kubernetes.io/projected/0e392aad-9ae5-4942-a078-8ef9cbaffb90-kube-api-access-rbw75\") pod \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") "
Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.240053 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e392aad-9ae5-4942-a078-8ef9cbaffb90-config" (OuterVolumeSpecName: "config") pod "0e392aad-9ae5-4942-a078-8ef9cbaffb90" (UID: "0e392aad-9ae5-4942-a078-8ef9cbaffb90"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.240112 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e392aad-9ae5-4942-a078-8ef9cbaffb90-serving-cert\") pod \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\" (UID: \"0e392aad-9ae5-4942-a078-8ef9cbaffb90\") " Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.240517 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e392aad-9ae5-4942-a078-8ef9cbaffb90-client-ca" (OuterVolumeSpecName: "client-ca") pod "0e392aad-9ae5-4942-a078-8ef9cbaffb90" (UID: "0e392aad-9ae5-4942-a078-8ef9cbaffb90"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.240626 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqw8r\" (UniqueName: \"kubernetes.io/projected/96bf498c-034c-431c-ae07-4099724a48a7-kube-api-access-vqw8r\") on node \"crc\" DevicePath \"\"" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.240662 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.240672 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e392aad-9ae5-4942-a078-8ef9cbaffb90-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.240683 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96bf498c-034c-431c-ae07-4099724a48a7-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.240693 4553 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.240706 4553 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96bf498c-034c-431c-ae07-4099724a48a7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.243960 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e392aad-9ae5-4942-a078-8ef9cbaffb90-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0e392aad-9ae5-4942-a078-8ef9cbaffb90" (UID: "0e392aad-9ae5-4942-a078-8ef9cbaffb90"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.244131 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e392aad-9ae5-4942-a078-8ef9cbaffb90-kube-api-access-rbw75" (OuterVolumeSpecName: "kube-api-access-rbw75") pod "0e392aad-9ae5-4942-a078-8ef9cbaffb90" (UID: "0e392aad-9ae5-4942-a078-8ef9cbaffb90"). InnerVolumeSpecName "kube-api-access-rbw75". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.343121 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbw75\" (UniqueName: \"kubernetes.io/projected/0e392aad-9ae5-4942-a078-8ef9cbaffb90-kube-api-access-rbw75\") on node \"crc\" DevicePath \"\"" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.343167 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e392aad-9ae5-4942-a078-8ef9cbaffb90-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.343189 4553 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e392aad-9ae5-4942-a078-8ef9cbaffb90-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.717415 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" event={"ID":"96bf498c-034c-431c-ae07-4099724a48a7","Type":"ContainerDied","Data":"5eaf684c2642f20caf01e6c5e97e920442e6e27fde9d0f73247cf38d2677083e"} Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.717467 4553 scope.go:117] "RemoveContainer" containerID="729718dc8f3535ff2e09a156c69f54432338cc37d1105fae6534c61833203332" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.717574 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5dq4n" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.723437 4553 generic.go:334] "Generic (PLEG): container finished" podID="0e392aad-9ae5-4942-a078-8ef9cbaffb90" containerID="626ec327a7a45bd0959c33a538624e89ff3647a9d6159d0b2e9fb933345564b0" exitCode=0 Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.723545 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" event={"ID":"0e392aad-9ae5-4942-a078-8ef9cbaffb90","Type":"ContainerDied","Data":"626ec327a7a45bd0959c33a538624e89ff3647a9d6159d0b2e9fb933345564b0"} Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.723586 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" event={"ID":"0e392aad-9ae5-4942-a078-8ef9cbaffb90","Type":"ContainerDied","Data":"974e3056057203644dfd96178b3c5437e4dce3661ab2b48d1166d6a4adfe096b"} Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.723671 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.748370 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-xpxlt" event={"ID":"ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8","Type":"ContainerStarted","Data":"81b48533da818106124270b739cd779481cc4646b7655b9d6de9192d8be0e33d"} Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.749569 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.769625 4553 scope.go:117] "RemoveContainer" containerID="626ec327a7a45bd0959c33a538624e89ff3647a9d6159d0b2e9fb933345564b0" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.784284 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-xpxlt" podStartSLOduration=6.590394565 podStartE2EDuration="14.784266915s" podCreationTimestamp="2025-09-30 19:44:38 +0000 UTC" firstStartedPulling="2025-09-30 19:44:39.164108688 +0000 UTC m=+732.363610838" lastFinishedPulling="2025-09-30 19:44:47.357981018 +0000 UTC m=+740.557483188" observedRunningTime="2025-09-30 19:44:52.783784611 +0000 UTC m=+745.983286761" watchObservedRunningTime="2025-09-30 19:44:52.784266915 +0000 UTC m=+745.983769055" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.801883 4553 scope.go:117] "RemoveContainer" containerID="626ec327a7a45bd0959c33a538624e89ff3647a9d6159d0b2e9fb933345564b0" Sep 30 19:44:52 crc kubenswrapper[4553]: E0930 19:44:52.802434 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626ec327a7a45bd0959c33a538624e89ff3647a9d6159d0b2e9fb933345564b0\": container with ID starting with 626ec327a7a45bd0959c33a538624e89ff3647a9d6159d0b2e9fb933345564b0 not found: ID does not exist" 
containerID="626ec327a7a45bd0959c33a538624e89ff3647a9d6159d0b2e9fb933345564b0" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.802481 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626ec327a7a45bd0959c33a538624e89ff3647a9d6159d0b2e9fb933345564b0"} err="failed to get container status \"626ec327a7a45bd0959c33a538624e89ff3647a9d6159d0b2e9fb933345564b0\": rpc error: code = NotFound desc = could not find container \"626ec327a7a45bd0959c33a538624e89ff3647a9d6159d0b2e9fb933345564b0\": container with ID starting with 626ec327a7a45bd0959c33a538624e89ff3647a9d6159d0b2e9fb933345564b0 not found: ID does not exist" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.823441 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5dq4n"] Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.838061 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5dq4n"] Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.843602 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75fc6f68df-qtr27"] Sep 30 19:44:52 crc kubenswrapper[4553]: E0930 19:44:52.844722 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96bf498c-034c-431c-ae07-4099724a48a7" containerName="controller-manager" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.844796 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="96bf498c-034c-431c-ae07-4099724a48a7" containerName="controller-manager" Sep 30 19:44:52 crc kubenswrapper[4553]: E0930 19:44:52.844873 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e392aad-9ae5-4942-a078-8ef9cbaffb90" containerName="route-controller-manager" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.844922 4553 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0e392aad-9ae5-4942-a078-8ef9cbaffb90" containerName="route-controller-manager" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.845177 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="96bf498c-034c-431c-ae07-4099724a48a7" containerName="controller-manager" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.845268 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e392aad-9ae5-4942-a078-8ef9cbaffb90" containerName="route-controller-manager" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.845787 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.850522 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr"] Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.851531 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.851743 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.852485 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.852502 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ad4790e-bb23-4375-9e44-be0b3d8182c9-client-ca\") pod \"route-controller-manager-78bb848964-zh9wr\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") " pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.852529 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad4790e-bb23-4375-9e44-be0b3d8182c9-config\") pod \"route-controller-manager-78bb848964-zh9wr\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") " pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.852547 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11cc6154-c65f-465f-a97b-b3c38cbdc249-client-ca\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.852583 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/11cc6154-c65f-465f-a97b-b3c38cbdc249-config\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.852619 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8b6n\" (UniqueName: \"kubernetes.io/projected/6ad4790e-bb23-4375-9e44-be0b3d8182c9-kube-api-access-v8b6n\") pod \"route-controller-manager-78bb848964-zh9wr\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") " pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.852637 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.852780 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.852637 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11cc6154-c65f-465f-a97b-b3c38cbdc249-proxy-ca-bundles\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.852905 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad4790e-bb23-4375-9e44-be0b3d8182c9-serving-cert\") pod \"route-controller-manager-78bb848964-zh9wr\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") " pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:52 crc 
kubenswrapper[4553]: I0930 19:44:52.852930 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szd48\" (UniqueName: \"kubernetes.io/projected/11cc6154-c65f-465f-a97b-b3c38cbdc249-kube-api-access-szd48\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.852951 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11cc6154-c65f-465f-a97b-b3c38cbdc249-serving-cert\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.853052 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.856836 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.856953 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m"] Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.859725 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.859947 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6tq2m"] Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.862358 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr"] Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.865055 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75fc6f68df-qtr27"] Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.869720 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.869939 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.870146 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.870335 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.870450 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.870562 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.953409 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8b6n\" (UniqueName: \"kubernetes.io/projected/6ad4790e-bb23-4375-9e44-be0b3d8182c9-kube-api-access-v8b6n\") pod \"route-controller-manager-78bb848964-zh9wr\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") " pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.953764 4553 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11cc6154-c65f-465f-a97b-b3c38cbdc249-proxy-ca-bundles\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.953815 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad4790e-bb23-4375-9e44-be0b3d8182c9-serving-cert\") pod \"route-controller-manager-78bb848964-zh9wr\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") " pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.955081 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11cc6154-c65f-465f-a97b-b3c38cbdc249-proxy-ca-bundles\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.955135 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szd48\" (UniqueName: \"kubernetes.io/projected/11cc6154-c65f-465f-a97b-b3c38cbdc249-kube-api-access-szd48\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.955165 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11cc6154-c65f-465f-a97b-b3c38cbdc249-serving-cert\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " 
pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.955186 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ad4790e-bb23-4375-9e44-be0b3d8182c9-client-ca\") pod \"route-controller-manager-78bb848964-zh9wr\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") " pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.955319 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad4790e-bb23-4375-9e44-be0b3d8182c9-config\") pod \"route-controller-manager-78bb848964-zh9wr\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") " pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.956012 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ad4790e-bb23-4375-9e44-be0b3d8182c9-client-ca\") pod \"route-controller-manager-78bb848964-zh9wr\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") " pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.956333 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad4790e-bb23-4375-9e44-be0b3d8182c9-config\") pod \"route-controller-manager-78bb848964-zh9wr\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") " pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.956382 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/11cc6154-c65f-465f-a97b-b3c38cbdc249-client-ca\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.958422 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11cc6154-c65f-465f-a97b-b3c38cbdc249-serving-cert\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.955344 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11cc6154-c65f-465f-a97b-b3c38cbdc249-client-ca\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.959536 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad4790e-bb23-4375-9e44-be0b3d8182c9-serving-cert\") pod \"route-controller-manager-78bb848964-zh9wr\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") " pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.959566 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11cc6154-c65f-465f-a97b-b3c38cbdc249-config\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.961056 4553 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11cc6154-c65f-465f-a97b-b3c38cbdc249-config\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.981031 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8b6n\" (UniqueName: \"kubernetes.io/projected/6ad4790e-bb23-4375-9e44-be0b3d8182c9-kube-api-access-v8b6n\") pod \"route-controller-manager-78bb848964-zh9wr\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") " pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:52 crc kubenswrapper[4553]: I0930 19:44:52.982700 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szd48\" (UniqueName: \"kubernetes.io/projected/11cc6154-c65f-465f-a97b-b3c38cbdc249-kube-api-access-szd48\") pod \"controller-manager-75fc6f68df-qtr27\" (UID: \"11cc6154-c65f-465f-a97b-b3c38cbdc249\") " pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.035501 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr"] Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.035932 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.168074 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27"
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.240561 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr"]
Sep 30 19:44:53 crc kubenswrapper[4553]: W0930 19:44:53.246573 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ad4790e_bb23_4375_9e44_be0b3d8182c9.slice/crio-7d6b750d81ccac7de77244ad97672ad7bab01d56a4f6849e3caf91969ce511b3 WatchSource:0}: Error finding container 7d6b750d81ccac7de77244ad97672ad7bab01d56a4f6849e3caf91969ce511b3: Status 404 returned error can't find the container with id 7d6b750d81ccac7de77244ad97672ad7bab01d56a4f6849e3caf91969ce511b3
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.370698 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75fc6f68df-qtr27"]
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.516309 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e392aad-9ae5-4942-a078-8ef9cbaffb90" path="/var/lib/kubelet/pods/0e392aad-9ae5-4942-a078-8ef9cbaffb90/volumes"
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.518253 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96bf498c-034c-431c-ae07-4099724a48a7" path="/var/lib/kubelet/pods/96bf498c-034c-431c-ae07-4099724a48a7/volumes"
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.756553 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" event={"ID":"6ad4790e-bb23-4375-9e44-be0b3d8182c9","Type":"ContainerStarted","Data":"aa655b01df093630695cbb3895c0b177a76d0d9b2c13b13b066de0a497899aa8"}
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.756605 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" event={"ID":"6ad4790e-bb23-4375-9e44-be0b3d8182c9","Type":"ContainerStarted","Data":"7d6b750d81ccac7de77244ad97672ad7bab01d56a4f6849e3caf91969ce511b3"}
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.757840 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" podUID="6ad4790e-bb23-4375-9e44-be0b3d8182c9" containerName="route-controller-manager" containerID="cri-o://aa655b01df093630695cbb3895c0b177a76d0d9b2c13b13b066de0a497899aa8" gracePeriod=30
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.758777 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" event={"ID":"11cc6154-c65f-465f-a97b-b3c38cbdc249","Type":"ContainerStarted","Data":"4b557ff86fab2f6e075ba89e122636b586c5d28610228368e00d9036b2ac8958"}
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.758803 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" event={"ID":"11cc6154-c65f-465f-a97b-b3c38cbdc249","Type":"ContainerStarted","Data":"f006f354469ef540be7c4cd509ae3565e8442a0b4b391d7007c099742d17178b"}
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.759056 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27"
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.774084 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27"
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.817518 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" podStartSLOduration=2.8174874709999997 podStartE2EDuration="2.817487471s" podCreationTimestamp="2025-09-30 19:44:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:44:53.816068094 +0000 UTC m=+747.015570214" watchObservedRunningTime="2025-09-30 19:44:53.817487471 +0000 UTC m=+747.016989601"
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.848388 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75fc6f68df-qtr27" podStartSLOduration=2.848368183 podStartE2EDuration="2.848368183s" podCreationTimestamp="2025-09-30 19:44:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:44:53.844125342 +0000 UTC m=+747.043627472" watchObservedRunningTime="2025-09-30 19:44:53.848368183 +0000 UTC m=+747.047870313"
Sep 30 19:44:53 crc kubenswrapper[4553]: I0930 19:44:53.986073 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.140503 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-xpxlt"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.290753 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.338112 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"]
Sep 30 19:44:54 crc kubenswrapper[4553]: E0930 19:44:54.338354 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ad4790e-bb23-4375-9e44-be0b3d8182c9" containerName="route-controller-manager"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.338372 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad4790e-bb23-4375-9e44-be0b3d8182c9" containerName="route-controller-manager"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.338491 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ad4790e-bb23-4375-9e44-be0b3d8182c9" containerName="route-controller-manager"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.338861 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.361482 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"]
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.376991 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ad4790e-bb23-4375-9e44-be0b3d8182c9-client-ca\") pod \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") "
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.377985 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad4790e-bb23-4375-9e44-be0b3d8182c9-client-ca" (OuterVolumeSpecName: "client-ca") pod "6ad4790e-bb23-4375-9e44-be0b3d8182c9" (UID: "6ad4790e-bb23-4375-9e44-be0b3d8182c9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.378092 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8b6n\" (UniqueName: \"kubernetes.io/projected/6ad4790e-bb23-4375-9e44-be0b3d8182c9-kube-api-access-v8b6n\") pod \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") "
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.378767 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad4790e-bb23-4375-9e44-be0b3d8182c9-serving-cert\") pod \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") "
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.378826 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad4790e-bb23-4375-9e44-be0b3d8182c9-config\") pod \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\" (UID: \"6ad4790e-bb23-4375-9e44-be0b3d8182c9\") "
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.379126 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/823e6369-b622-4cb2-b3ab-53997ac63a08-client-ca\") pod \"route-controller-manager-7c54669b74-dd292\" (UID: \"823e6369-b622-4cb2-b3ab-53997ac63a08\") " pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.379192 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/823e6369-b622-4cb2-b3ab-53997ac63a08-config\") pod \"route-controller-manager-7c54669b74-dd292\" (UID: \"823e6369-b622-4cb2-b3ab-53997ac63a08\") " pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.379238 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff4wp\" (UniqueName: \"kubernetes.io/projected/823e6369-b622-4cb2-b3ab-53997ac63a08-kube-api-access-ff4wp\") pod \"route-controller-manager-7c54669b74-dd292\" (UID: \"823e6369-b622-4cb2-b3ab-53997ac63a08\") " pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.379270 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/823e6369-b622-4cb2-b3ab-53997ac63a08-serving-cert\") pod \"route-controller-manager-7c54669b74-dd292\" (UID: \"823e6369-b622-4cb2-b3ab-53997ac63a08\") " pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.379319 4553 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ad4790e-bb23-4375-9e44-be0b3d8182c9-client-ca\") on node \"crc\" DevicePath \"\""
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.379640 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad4790e-bb23-4375-9e44-be0b3d8182c9-config" (OuterVolumeSpecName: "config") pod "6ad4790e-bb23-4375-9e44-be0b3d8182c9" (UID: "6ad4790e-bb23-4375-9e44-be0b3d8182c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.387513 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ad4790e-bb23-4375-9e44-be0b3d8182c9-kube-api-access-v8b6n" (OuterVolumeSpecName: "kube-api-access-v8b6n") pod "6ad4790e-bb23-4375-9e44-be0b3d8182c9" (UID: "6ad4790e-bb23-4375-9e44-be0b3d8182c9"). InnerVolumeSpecName "kube-api-access-v8b6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.394298 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ad4790e-bb23-4375-9e44-be0b3d8182c9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6ad4790e-bb23-4375-9e44-be0b3d8182c9" (UID: "6ad4790e-bb23-4375-9e44-be0b3d8182c9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.480287 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/823e6369-b622-4cb2-b3ab-53997ac63a08-config\") pod \"route-controller-manager-7c54669b74-dd292\" (UID: \"823e6369-b622-4cb2-b3ab-53997ac63a08\") " pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.480345 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff4wp\" (UniqueName: \"kubernetes.io/projected/823e6369-b622-4cb2-b3ab-53997ac63a08-kube-api-access-ff4wp\") pod \"route-controller-manager-7c54669b74-dd292\" (UID: \"823e6369-b622-4cb2-b3ab-53997ac63a08\") " pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.480375 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/823e6369-b622-4cb2-b3ab-53997ac63a08-serving-cert\") pod \"route-controller-manager-7c54669b74-dd292\" (UID: \"823e6369-b622-4cb2-b3ab-53997ac63a08\") " pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.480417 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/823e6369-b622-4cb2-b3ab-53997ac63a08-client-ca\") pod \"route-controller-manager-7c54669b74-dd292\" (UID: \"823e6369-b622-4cb2-b3ab-53997ac63a08\") " pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.480479 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8b6n\" (UniqueName: \"kubernetes.io/projected/6ad4790e-bb23-4375-9e44-be0b3d8182c9-kube-api-access-v8b6n\") on node \"crc\" DevicePath \"\""
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.480490 4553 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad4790e-bb23-4375-9e44-be0b3d8182c9-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.480502 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad4790e-bb23-4375-9e44-be0b3d8182c9-config\") on node \"crc\" DevicePath \"\""
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.481544 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/823e6369-b622-4cb2-b3ab-53997ac63a08-client-ca\") pod \"route-controller-manager-7c54669b74-dd292\" (UID: \"823e6369-b622-4cb2-b3ab-53997ac63a08\") " pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.481619 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/823e6369-b622-4cb2-b3ab-53997ac63a08-config\") pod \"route-controller-manager-7c54669b74-dd292\" (UID: \"823e6369-b622-4cb2-b3ab-53997ac63a08\") " pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.485778 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/823e6369-b622-4cb2-b3ab-53997ac63a08-serving-cert\") pod \"route-controller-manager-7c54669b74-dd292\" (UID: \"823e6369-b622-4cb2-b3ab-53997ac63a08\") " pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.500786 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff4wp\" (UniqueName: \"kubernetes.io/projected/823e6369-b622-4cb2-b3ab-53997ac63a08-kube-api-access-ff4wp\") pod \"route-controller-manager-7c54669b74-dd292\" (UID: \"823e6369-b622-4cb2-b3ab-53997ac63a08\") " pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.660860 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.769265 4553 generic.go:334] "Generic (PLEG): container finished" podID="6ad4790e-bb23-4375-9e44-be0b3d8182c9" containerID="aa655b01df093630695cbb3895c0b177a76d0d9b2c13b13b066de0a497899aa8" exitCode=0
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.769387 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" event={"ID":"6ad4790e-bb23-4375-9e44-be0b3d8182c9","Type":"ContainerDied","Data":"aa655b01df093630695cbb3895c0b177a76d0d9b2c13b13b066de0a497899aa8"}
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.769942 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr" event={"ID":"6ad4790e-bb23-4375-9e44-be0b3d8182c9","Type":"ContainerDied","Data":"7d6b750d81ccac7de77244ad97672ad7bab01d56a4f6849e3caf91969ce511b3"}
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.769963 4553 scope.go:117] "RemoveContainer" containerID="aa655b01df093630695cbb3895c0b177a76d0d9b2c13b13b066de0a497899aa8"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.769475 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.815980 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr"]
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.816189 4553 scope.go:117] "RemoveContainer" containerID="aa655b01df093630695cbb3895c0b177a76d0d9b2c13b13b066de0a497899aa8"
Sep 30 19:44:54 crc kubenswrapper[4553]: E0930 19:44:54.816613 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa655b01df093630695cbb3895c0b177a76d0d9b2c13b13b066de0a497899aa8\": container with ID starting with aa655b01df093630695cbb3895c0b177a76d0d9b2c13b13b066de0a497899aa8 not found: ID does not exist" containerID="aa655b01df093630695cbb3895c0b177a76d0d9b2c13b13b066de0a497899aa8"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.816637 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa655b01df093630695cbb3895c0b177a76d0d9b2c13b13b066de0a497899aa8"} err="failed to get container status \"aa655b01df093630695cbb3895c0b177a76d0d9b2c13b13b066de0a497899aa8\": rpc error: code = NotFound desc = could not find container \"aa655b01df093630695cbb3895c0b177a76d0d9b2c13b13b066de0a497899aa8\": container with ID starting with aa655b01df093630695cbb3895c0b177a76d0d9b2c13b13b066de0a497899aa8 not found: ID does not exist"
Sep 30 19:44:54 crc kubenswrapper[4553]: I0930 19:44:54.827401 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78bb848964-zh9wr"]
Sep 30 19:44:55 crc kubenswrapper[4553]: I0930 19:44:55.155788 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"]
Sep 30 19:44:55 crc kubenswrapper[4553]: I0930 19:44:55.527238 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ad4790e-bb23-4375-9e44-be0b3d8182c9" path="/var/lib/kubelet/pods/6ad4790e-bb23-4375-9e44-be0b3d8182c9/volumes"
Sep 30 19:44:55 crc kubenswrapper[4553]: I0930 19:44:55.779532 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292" event={"ID":"823e6369-b622-4cb2-b3ab-53997ac63a08","Type":"ContainerStarted","Data":"2a156905ff56ad8d6e20336b0e5bdd4cecc4e9b7f7bf13b000d2066b9f4365f0"}
Sep 30 19:44:55 crc kubenswrapper[4553]: I0930 19:44:55.779574 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292" event={"ID":"823e6369-b622-4cb2-b3ab-53997ac63a08","Type":"ContainerStarted","Data":"eef0bd2ab83d8708579ffada74e0638e84bad6f54e0128b88a0b8371509b52cb"}
Sep 30 19:44:55 crc kubenswrapper[4553]: I0930 19:44:55.805252 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292" podStartSLOduration=2.805233643 podStartE2EDuration="2.805233643s" podCreationTimestamp="2025-09-30 19:44:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:44:55.801417974 +0000 UTC m=+749.000920104" watchObservedRunningTime="2025-09-30 19:44:55.805233643 +0000 UTC m=+749.004735773"
Sep 30 19:44:56 crc kubenswrapper[4553]: I0930 19:44:56.785410 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:56 crc kubenswrapper[4553]: I0930 19:44:56.791813 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c54669b74-dd292"
Sep 30 19:44:59 crc kubenswrapper[4553]: I0930 19:44:59.003924 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-f4rnh"
Sep 30 19:44:59 crc kubenswrapper[4553]: I0930 19:44:59.585580 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 19:44:59 crc kubenswrapper[4553]: I0930 19:44:59.585954 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 19:44:59 crc kubenswrapper[4553]: I0930 19:44:59.603862 4553 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Sep 30 19:44:59 crc kubenswrapper[4553]: I0930 19:44:59.974605 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-5k2m2"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.144427 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"]
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.145350 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.149048 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.152233 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.158526 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"]
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.193992 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51b24d99-4741-470e-8664-1ef873cc39ad-secret-volume\") pod \"collect-profiles-29321025-8kftf\" (UID: \"51b24d99-4741-470e-8664-1ef873cc39ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.194154 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw99q\" (UniqueName: \"kubernetes.io/projected/51b24d99-4741-470e-8664-1ef873cc39ad-kube-api-access-nw99q\") pod \"collect-profiles-29321025-8kftf\" (UID: \"51b24d99-4741-470e-8664-1ef873cc39ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.194201 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51b24d99-4741-470e-8664-1ef873cc39ad-config-volume\") pod \"collect-profiles-29321025-8kftf\" (UID: \"51b24d99-4741-470e-8664-1ef873cc39ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.295694 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51b24d99-4741-470e-8664-1ef873cc39ad-secret-volume\") pod \"collect-profiles-29321025-8kftf\" (UID: \"51b24d99-4741-470e-8664-1ef873cc39ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.295745 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw99q\" (UniqueName: \"kubernetes.io/projected/51b24d99-4741-470e-8664-1ef873cc39ad-kube-api-access-nw99q\") pod \"collect-profiles-29321025-8kftf\" (UID: \"51b24d99-4741-470e-8664-1ef873cc39ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.295788 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51b24d99-4741-470e-8664-1ef873cc39ad-config-volume\") pod \"collect-profiles-29321025-8kftf\" (UID: \"51b24d99-4741-470e-8664-1ef873cc39ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.296784 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51b24d99-4741-470e-8664-1ef873cc39ad-config-volume\") pod \"collect-profiles-29321025-8kftf\" (UID: \"51b24d99-4741-470e-8664-1ef873cc39ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.306645 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51b24d99-4741-470e-8664-1ef873cc39ad-secret-volume\") pod \"collect-profiles-29321025-8kftf\" (UID: \"51b24d99-4741-470e-8664-1ef873cc39ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.320892 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw99q\" (UniqueName: \"kubernetes.io/projected/51b24d99-4741-470e-8664-1ef873cc39ad-kube-api-access-nw99q\") pod \"collect-profiles-29321025-8kftf\" (UID: \"51b24d99-4741-470e-8664-1ef873cc39ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.459934 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"
Sep 30 19:45:00 crc kubenswrapper[4553]: I0930 19:45:00.921933 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"]
Sep 30 19:45:00 crc kubenswrapper[4553]: W0930 19:45:00.925379 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51b24d99_4741_470e_8664_1ef873cc39ad.slice/crio-73cc22bfc4c6222d1d59d5d322d9e459c51d898bbe8db8c6f7e4ee7b409c010b WatchSource:0}: Error finding container 73cc22bfc4c6222d1d59d5d322d9e459c51d898bbe8db8c6f7e4ee7b409c010b: Status 404 returned error can't find the container with id 73cc22bfc4c6222d1d59d5d322d9e459c51d898bbe8db8c6f7e4ee7b409c010b
Sep 30 19:45:01 crc kubenswrapper[4553]: I0930 19:45:01.817103 4553 generic.go:334] "Generic (PLEG): container finished" podID="51b24d99-4741-470e-8664-1ef873cc39ad" containerID="2121568ed4f12b3bbdfc4834a7cab922b9e376e063a329c67954b7ee4d71a7eb" exitCode=0
Sep 30 19:45:01 crc kubenswrapper[4553]: I0930 19:45:01.817226 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf" event={"ID":"51b24d99-4741-470e-8664-1ef873cc39ad","Type":"ContainerDied","Data":"2121568ed4f12b3bbdfc4834a7cab922b9e376e063a329c67954b7ee4d71a7eb"}
Sep 30 19:45:01 crc kubenswrapper[4553]: I0930 19:45:01.817483 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf" event={"ID":"51b24d99-4741-470e-8664-1ef873cc39ad","Type":"ContainerStarted","Data":"73cc22bfc4c6222d1d59d5d322d9e459c51d898bbe8db8c6f7e4ee7b409c010b"}
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.222591 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.342586 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51b24d99-4741-470e-8664-1ef873cc39ad-secret-volume\") pod \"51b24d99-4741-470e-8664-1ef873cc39ad\" (UID: \"51b24d99-4741-470e-8664-1ef873cc39ad\") "
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.342718 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51b24d99-4741-470e-8664-1ef873cc39ad-config-volume\") pod \"51b24d99-4741-470e-8664-1ef873cc39ad\" (UID: \"51b24d99-4741-470e-8664-1ef873cc39ad\") "
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.342850 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw99q\" (UniqueName: \"kubernetes.io/projected/51b24d99-4741-470e-8664-1ef873cc39ad-kube-api-access-nw99q\") pod \"51b24d99-4741-470e-8664-1ef873cc39ad\" (UID: \"51b24d99-4741-470e-8664-1ef873cc39ad\") "
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.343644 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51b24d99-4741-470e-8664-1ef873cc39ad-config-volume" (OuterVolumeSpecName: "config-volume") pod "51b24d99-4741-470e-8664-1ef873cc39ad" (UID: "51b24d99-4741-470e-8664-1ef873cc39ad"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.349784 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51b24d99-4741-470e-8664-1ef873cc39ad-kube-api-access-nw99q" (OuterVolumeSpecName: "kube-api-access-nw99q") pod "51b24d99-4741-470e-8664-1ef873cc39ad" (UID: "51b24d99-4741-470e-8664-1ef873cc39ad"). InnerVolumeSpecName "kube-api-access-nw99q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.356247 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b24d99-4741-470e-8664-1ef873cc39ad-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "51b24d99-4741-470e-8664-1ef873cc39ad" (UID: "51b24d99-4741-470e-8664-1ef873cc39ad"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.444931 4553 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/51b24d99-4741-470e-8664-1ef873cc39ad-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.444984 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw99q\" (UniqueName: \"kubernetes.io/projected/51b24d99-4741-470e-8664-1ef873cc39ad-kube-api-access-nw99q\") on node \"crc\" DevicePath \"\""
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.445008 4553 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/51b24d99-4741-470e-8664-1ef873cc39ad-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.464104 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5945h"]
Sep 30 19:45:03 crc kubenswrapper[4553]: E0930 19:45:03.464515 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51b24d99-4741-470e-8664-1ef873cc39ad" containerName="collect-profiles"
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.464582 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b24d99-4741-470e-8664-1ef873cc39ad" containerName="collect-profiles"
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.464762 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="51b24d99-4741-470e-8664-1ef873cc39ad" containerName="collect-profiles"
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.465198 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5945h"
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.469304 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.469554 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.512765 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5945h"]
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.546475 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4d4v\" (UniqueName: \"kubernetes.io/projected/b7d464ae-23b5-46d0-9c22-53c79a0536b6-kube-api-access-c4d4v\") pod \"openstack-operator-index-5945h\" (UID: \"b7d464ae-23b5-46d0-9c22-53c79a0536b6\") " pod="openstack-operators/openstack-operator-index-5945h"
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.648345 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4d4v\" (UniqueName: \"kubernetes.io/projected/b7d464ae-23b5-46d0-9c22-53c79a0536b6-kube-api-access-c4d4v\") pod \"openstack-operator-index-5945h\" (UID: \"b7d464ae-23b5-46d0-9c22-53c79a0536b6\") " pod="openstack-operators/openstack-operator-index-5945h"
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.668589 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4d4v\" (UniqueName: \"kubernetes.io/projected/b7d464ae-23b5-46d0-9c22-53c79a0536b6-kube-api-access-c4d4v\") pod \"openstack-operator-index-5945h\" (UID: \"b7d464ae-23b5-46d0-9c22-53c79a0536b6\") " pod="openstack-operators/openstack-operator-index-5945h"
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.779169 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5945h"
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.843076 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf" event={"ID":"51b24d99-4741-470e-8664-1ef873cc39ad","Type":"ContainerDied","Data":"73cc22bfc4c6222d1d59d5d322d9e459c51d898bbe8db8c6f7e4ee7b409c010b"}
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.843492 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73cc22bfc4c6222d1d59d5d322d9e459c51d898bbe8db8c6f7e4ee7b409c010b"
Sep 30 19:45:03 crc kubenswrapper[4553]: I0930 19:45:03.843142 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321025-8kftf"
Sep 30 19:45:04 crc kubenswrapper[4553]: I0930 19:45:04.225949 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5945h"]
Sep 30 19:45:04 crc kubenswrapper[4553]: W0930 19:45:04.237414 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7d464ae_23b5_46d0_9c22_53c79a0536b6.slice/crio-b0ca78b1d22333a0cffa469f32db5f2747deb78796a568af10e4b1a19a723d01 WatchSource:0}: Error finding container b0ca78b1d22333a0cffa469f32db5f2747deb78796a568af10e4b1a19a723d01: Status 404 returned error can't find the container with id b0ca78b1d22333a0cffa469f32db5f2747deb78796a568af10e4b1a19a723d01
Sep 30 19:45:04 crc kubenswrapper[4553]: I0930 19:45:04.851863 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5945h" event={"ID":"b7d464ae-23b5-46d0-9c22-53c79a0536b6","Type":"ContainerStarted","Data":"b0ca78b1d22333a0cffa469f32db5f2747deb78796a568af10e4b1a19a723d01"}
Sep 30 19:45:05 crc kubenswrapper[4553]: I0930 19:45:05.826452 4553 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5945h"] Sep 30 19:45:06 crc kubenswrapper[4553]: I0930 19:45:06.436013 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6w8vg"] Sep 30 19:45:06 crc kubenswrapper[4553]: I0930 19:45:06.442967 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6w8vg" Sep 30 19:45:06 crc kubenswrapper[4553]: I0930 19:45:06.445824 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-4s64j" Sep 30 19:45:06 crc kubenswrapper[4553]: I0930 19:45:06.460564 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6w8vg"] Sep 30 19:45:06 crc kubenswrapper[4553]: I0930 19:45:06.483283 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zdh9\" (UniqueName: \"kubernetes.io/projected/b9037a2f-77dd-423b-9d81-432c9a554e15-kube-api-access-4zdh9\") pod \"openstack-operator-index-6w8vg\" (UID: \"b9037a2f-77dd-423b-9d81-432c9a554e15\") " pod="openstack-operators/openstack-operator-index-6w8vg" Sep 30 19:45:06 crc kubenswrapper[4553]: I0930 19:45:06.584718 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zdh9\" (UniqueName: \"kubernetes.io/projected/b9037a2f-77dd-423b-9d81-432c9a554e15-kube-api-access-4zdh9\") pod \"openstack-operator-index-6w8vg\" (UID: \"b9037a2f-77dd-423b-9d81-432c9a554e15\") " pod="openstack-operators/openstack-operator-index-6w8vg" Sep 30 19:45:06 crc kubenswrapper[4553]: I0930 19:45:06.609219 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zdh9\" (UniqueName: \"kubernetes.io/projected/b9037a2f-77dd-423b-9d81-432c9a554e15-kube-api-access-4zdh9\") pod \"openstack-operator-index-6w8vg\" (UID: 
\"b9037a2f-77dd-423b-9d81-432c9a554e15\") " pod="openstack-operators/openstack-operator-index-6w8vg" Sep 30 19:45:06 crc kubenswrapper[4553]: I0930 19:45:06.770414 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6w8vg" Sep 30 19:45:06 crc kubenswrapper[4553]: I0930 19:45:06.867153 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5945h" event={"ID":"b7d464ae-23b5-46d0-9c22-53c79a0536b6","Type":"ContainerStarted","Data":"da15ec10d02799e0e7ccb7563c5fec45547d9af2900d7942192b2bc18da5f4ad"} Sep 30 19:45:06 crc kubenswrapper[4553]: I0930 19:45:06.868130 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-5945h" podUID="b7d464ae-23b5-46d0-9c22-53c79a0536b6" containerName="registry-server" containerID="cri-o://da15ec10d02799e0e7ccb7563c5fec45547d9af2900d7942192b2bc18da5f4ad" gracePeriod=2 Sep 30 19:45:06 crc kubenswrapper[4553]: I0930 19:45:06.894665 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5945h" podStartSLOduration=1.467871815 podStartE2EDuration="3.894644965s" podCreationTimestamp="2025-09-30 19:45:03 +0000 UTC" firstStartedPulling="2025-09-30 19:45:04.240905264 +0000 UTC m=+757.440407394" lastFinishedPulling="2025-09-30 19:45:06.667678404 +0000 UTC m=+759.867180544" observedRunningTime="2025-09-30 19:45:06.88915812 +0000 UTC m=+760.088660250" watchObservedRunningTime="2025-09-30 19:45:06.894644965 +0000 UTC m=+760.094147095" Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.189732 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6w8vg"] Sep 30 19:45:07 crc kubenswrapper[4553]: W0930 19:45:07.195172 4553 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9037a2f_77dd_423b_9d81_432c9a554e15.slice/crio-8efdc1c129fae8c9e7e12462ee8b9d92385fd1064c60118a613f219aa8f7f218 WatchSource:0}: Error finding container 8efdc1c129fae8c9e7e12462ee8b9d92385fd1064c60118a613f219aa8f7f218: Status 404 returned error can't find the container with id 8efdc1c129fae8c9e7e12462ee8b9d92385fd1064c60118a613f219aa8f7f218 Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.287224 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5945h" Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.396168 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4d4v\" (UniqueName: \"kubernetes.io/projected/b7d464ae-23b5-46d0-9c22-53c79a0536b6-kube-api-access-c4d4v\") pod \"b7d464ae-23b5-46d0-9c22-53c79a0536b6\" (UID: \"b7d464ae-23b5-46d0-9c22-53c79a0536b6\") " Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.402168 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d464ae-23b5-46d0-9c22-53c79a0536b6-kube-api-access-c4d4v" (OuterVolumeSpecName: "kube-api-access-c4d4v") pod "b7d464ae-23b5-46d0-9c22-53c79a0536b6" (UID: "b7d464ae-23b5-46d0-9c22-53c79a0536b6"). InnerVolumeSpecName "kube-api-access-c4d4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.498229 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4d4v\" (UniqueName: \"kubernetes.io/projected/b7d464ae-23b5-46d0-9c22-53c79a0536b6-kube-api-access-c4d4v\") on node \"crc\" DevicePath \"\"" Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.872789 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6w8vg" event={"ID":"b9037a2f-77dd-423b-9d81-432c9a554e15","Type":"ContainerStarted","Data":"b3e293bd5f656eeaa0bfe91391003c4029bdbcfd16a510efaa066d6ebc2f8892"} Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.873114 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6w8vg" event={"ID":"b9037a2f-77dd-423b-9d81-432c9a554e15","Type":"ContainerStarted","Data":"8efdc1c129fae8c9e7e12462ee8b9d92385fd1064c60118a613f219aa8f7f218"} Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.874696 4553 generic.go:334] "Generic (PLEG): container finished" podID="b7d464ae-23b5-46d0-9c22-53c79a0536b6" containerID="da15ec10d02799e0e7ccb7563c5fec45547d9af2900d7942192b2bc18da5f4ad" exitCode=0 Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.874739 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5945h" Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.874740 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5945h" event={"ID":"b7d464ae-23b5-46d0-9c22-53c79a0536b6","Type":"ContainerDied","Data":"da15ec10d02799e0e7ccb7563c5fec45547d9af2900d7942192b2bc18da5f4ad"} Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.874879 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5945h" event={"ID":"b7d464ae-23b5-46d0-9c22-53c79a0536b6","Type":"ContainerDied","Data":"b0ca78b1d22333a0cffa469f32db5f2747deb78796a568af10e4b1a19a723d01"} Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.874912 4553 scope.go:117] "RemoveContainer" containerID="da15ec10d02799e0e7ccb7563c5fec45547d9af2900d7942192b2bc18da5f4ad" Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.890501 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6w8vg" podStartSLOduration=1.8389940930000002 podStartE2EDuration="1.890456727s" podCreationTimestamp="2025-09-30 19:45:06 +0000 UTC" firstStartedPulling="2025-09-30 19:45:07.198061184 +0000 UTC m=+760.397563314" lastFinishedPulling="2025-09-30 19:45:07.249523818 +0000 UTC m=+760.449025948" observedRunningTime="2025-09-30 19:45:07.885740982 +0000 UTC m=+761.085243112" watchObservedRunningTime="2025-09-30 19:45:07.890456727 +0000 UTC m=+761.089958877" Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.903409 4553 scope.go:117] "RemoveContainer" containerID="da15ec10d02799e0e7ccb7563c5fec45547d9af2900d7942192b2bc18da5f4ad" Sep 30 19:45:07 crc kubenswrapper[4553]: E0930 19:45:07.903723 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da15ec10d02799e0e7ccb7563c5fec45547d9af2900d7942192b2bc18da5f4ad\": container 
with ID starting with da15ec10d02799e0e7ccb7563c5fec45547d9af2900d7942192b2bc18da5f4ad not found: ID does not exist" containerID="da15ec10d02799e0e7ccb7563c5fec45547d9af2900d7942192b2bc18da5f4ad" Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.903761 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da15ec10d02799e0e7ccb7563c5fec45547d9af2900d7942192b2bc18da5f4ad"} err="failed to get container status \"da15ec10d02799e0e7ccb7563c5fec45547d9af2900d7942192b2bc18da5f4ad\": rpc error: code = NotFound desc = could not find container \"da15ec10d02799e0e7ccb7563c5fec45547d9af2900d7942192b2bc18da5f4ad\": container with ID starting with da15ec10d02799e0e7ccb7563c5fec45547d9af2900d7942192b2bc18da5f4ad not found: ID does not exist" Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.906800 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5945h"] Sep 30 19:45:07 crc kubenswrapper[4553]: I0930 19:45:07.912184 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5945h"] Sep 30 19:45:08 crc kubenswrapper[4553]: I0930 19:45:08.989265 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-xpxlt" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.243832 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vlsw5"] Sep 30 19:45:09 crc kubenswrapper[4553]: E0930 19:45:09.244249 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d464ae-23b5-46d0-9c22-53c79a0536b6" containerName="registry-server" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.244279 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d464ae-23b5-46d0-9c22-53c79a0536b6" containerName="registry-server" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.244482 4553 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="b7d464ae-23b5-46d0-9c22-53c79a0536b6" containerName="registry-server" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.245914 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.287329 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vlsw5"] Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.324833 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4lzr\" (UniqueName: \"kubernetes.io/projected/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-kube-api-access-g4lzr\") pod \"redhat-operators-vlsw5\" (UID: \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\") " pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.324965 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-utilities\") pod \"redhat-operators-vlsw5\" (UID: \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\") " pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.325108 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-catalog-content\") pod \"redhat-operators-vlsw5\" (UID: \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\") " pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.426574 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-catalog-content\") pod \"redhat-operators-vlsw5\" (UID: 
\"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\") " pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.426871 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4lzr\" (UniqueName: \"kubernetes.io/projected/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-kube-api-access-g4lzr\") pod \"redhat-operators-vlsw5\" (UID: \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\") " pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.426919 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-utilities\") pod \"redhat-operators-vlsw5\" (UID: \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\") " pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.427879 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-utilities\") pod \"redhat-operators-vlsw5\" (UID: \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\") " pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.440166 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-catalog-content\") pod \"redhat-operators-vlsw5\" (UID: \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\") " pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.463437 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4lzr\" (UniqueName: \"kubernetes.io/projected/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-kube-api-access-g4lzr\") pod \"redhat-operators-vlsw5\" (UID: \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\") " 
pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.515156 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d464ae-23b5-46d0-9c22-53c79a0536b6" path="/var/lib/kubelet/pods/b7d464ae-23b5-46d0-9c22-53c79a0536b6/volumes" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.577458 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:09 crc kubenswrapper[4553]: I0930 19:45:09.974661 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vlsw5"] Sep 30 19:45:10 crc kubenswrapper[4553]: I0930 19:45:10.907110 4553 generic.go:334] "Generic (PLEG): container finished" podID="2af749ee-af7b-49a4-9b1a-6058a96a7bc5" containerID="f665242b7d0328f1190382a4ff2acc6491699e44bdfee387355f6d3745ce055e" exitCode=0 Sep 30 19:45:10 crc kubenswrapper[4553]: I0930 19:45:10.907169 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlsw5" event={"ID":"2af749ee-af7b-49a4-9b1a-6058a96a7bc5","Type":"ContainerDied","Data":"f665242b7d0328f1190382a4ff2acc6491699e44bdfee387355f6d3745ce055e"} Sep 30 19:45:10 crc kubenswrapper[4553]: I0930 19:45:10.907221 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlsw5" event={"ID":"2af749ee-af7b-49a4-9b1a-6058a96a7bc5","Type":"ContainerStarted","Data":"46e750361203c3b22b7179eec8884e63c29b33ab13c237894eaf0cd170054749"} Sep 30 19:45:11 crc kubenswrapper[4553]: I0930 19:45:11.914806 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlsw5" event={"ID":"2af749ee-af7b-49a4-9b1a-6058a96a7bc5","Type":"ContainerStarted","Data":"7ae846c35c9dcb89fd21eeb70b7c52c291cffa959ac2d029537888e3fff557ce"} Sep 30 19:45:12 crc kubenswrapper[4553]: I0930 19:45:12.928419 4553 generic.go:334] "Generic (PLEG): container finished" 
podID="2af749ee-af7b-49a4-9b1a-6058a96a7bc5" containerID="7ae846c35c9dcb89fd21eeb70b7c52c291cffa959ac2d029537888e3fff557ce" exitCode=0 Sep 30 19:45:12 crc kubenswrapper[4553]: I0930 19:45:12.928545 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlsw5" event={"ID":"2af749ee-af7b-49a4-9b1a-6058a96a7bc5","Type":"ContainerDied","Data":"7ae846c35c9dcb89fd21eeb70b7c52c291cffa959ac2d029537888e3fff557ce"} Sep 30 19:45:13 crc kubenswrapper[4553]: I0930 19:45:13.936900 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlsw5" event={"ID":"2af749ee-af7b-49a4-9b1a-6058a96a7bc5","Type":"ContainerStarted","Data":"4fad4a64af09b485e1c3f69e52fac89cd830935da39d836c8f4b08349ab11899"} Sep 30 19:45:13 crc kubenswrapper[4553]: I0930 19:45:13.988199 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vlsw5" podStartSLOduration=2.566283969 podStartE2EDuration="4.988174552s" podCreationTimestamp="2025-09-30 19:45:09 +0000 UTC" firstStartedPulling="2025-09-30 19:45:10.914992219 +0000 UTC m=+764.114494389" lastFinishedPulling="2025-09-30 19:45:13.336882812 +0000 UTC m=+766.536384972" observedRunningTime="2025-09-30 19:45:13.98390684 +0000 UTC m=+767.183409000" watchObservedRunningTime="2025-09-30 19:45:13.988174552 +0000 UTC m=+767.187676712" Sep 30 19:45:16 crc kubenswrapper[4553]: I0930 19:45:16.770552 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-6w8vg" Sep 30 19:45:16 crc kubenswrapper[4553]: I0930 19:45:16.770956 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-6w8vg" Sep 30 19:45:16 crc kubenswrapper[4553]: I0930 19:45:16.807689 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-6w8vg" Sep 30 19:45:16 crc 
kubenswrapper[4553]: I0930 19:45:16.986870 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-6w8vg" Sep 30 19:45:17 crc kubenswrapper[4553]: I0930 19:45:17.905450 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj"] Sep 30 19:45:17 crc kubenswrapper[4553]: I0930 19:45:17.906829 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" Sep 30 19:45:17 crc kubenswrapper[4553]: I0930 19:45:17.915540 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-stzx8" Sep 30 19:45:17 crc kubenswrapper[4553]: I0930 19:45:17.924368 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj"] Sep 30 19:45:17 crc kubenswrapper[4553]: I0930 19:45:17.949705 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl8xb\" (UniqueName: \"kubernetes.io/projected/5aa35519-bdc4-4eb7-a039-7238829d51ac-kube-api-access-tl8xb\") pod \"53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj\" (UID: \"5aa35519-bdc4-4eb7-a039-7238829d51ac\") " pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" Sep 30 19:45:17 crc kubenswrapper[4553]: I0930 19:45:17.949765 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aa35519-bdc4-4eb7-a039-7238829d51ac-util\") pod \"53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj\" (UID: \"5aa35519-bdc4-4eb7-a039-7238829d51ac\") " pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" Sep 30 19:45:17 crc 
kubenswrapper[4553]: I0930 19:45:17.949787 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aa35519-bdc4-4eb7-a039-7238829d51ac-bundle\") pod \"53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj\" (UID: \"5aa35519-bdc4-4eb7-a039-7238829d51ac\") " pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" Sep 30 19:45:18 crc kubenswrapper[4553]: I0930 19:45:18.053288 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl8xb\" (UniqueName: \"kubernetes.io/projected/5aa35519-bdc4-4eb7-a039-7238829d51ac-kube-api-access-tl8xb\") pod \"53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj\" (UID: \"5aa35519-bdc4-4eb7-a039-7238829d51ac\") " pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" Sep 30 19:45:18 crc kubenswrapper[4553]: I0930 19:45:18.053448 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aa35519-bdc4-4eb7-a039-7238829d51ac-util\") pod \"53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj\" (UID: \"5aa35519-bdc4-4eb7-a039-7238829d51ac\") " pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" Sep 30 19:45:18 crc kubenswrapper[4553]: I0930 19:45:18.053490 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aa35519-bdc4-4eb7-a039-7238829d51ac-bundle\") pod \"53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj\" (UID: \"5aa35519-bdc4-4eb7-a039-7238829d51ac\") " pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" Sep 30 19:45:18 crc kubenswrapper[4553]: I0930 19:45:18.054186 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/5aa35519-bdc4-4eb7-a039-7238829d51ac-util\") pod \"53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj\" (UID: \"5aa35519-bdc4-4eb7-a039-7238829d51ac\") " pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" Sep 30 19:45:18 crc kubenswrapper[4553]: I0930 19:45:18.054255 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aa35519-bdc4-4eb7-a039-7238829d51ac-bundle\") pod \"53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj\" (UID: \"5aa35519-bdc4-4eb7-a039-7238829d51ac\") " pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" Sep 30 19:45:18 crc kubenswrapper[4553]: I0930 19:45:18.076122 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl8xb\" (UniqueName: \"kubernetes.io/projected/5aa35519-bdc4-4eb7-a039-7238829d51ac-kube-api-access-tl8xb\") pod \"53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj\" (UID: \"5aa35519-bdc4-4eb7-a039-7238829d51ac\") " pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" Sep 30 19:45:18 crc kubenswrapper[4553]: I0930 19:45:18.232344 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" Sep 30 19:45:18 crc kubenswrapper[4553]: I0930 19:45:18.749861 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj"] Sep 30 19:45:18 crc kubenswrapper[4553]: I0930 19:45:18.972826 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" event={"ID":"5aa35519-bdc4-4eb7-a039-7238829d51ac","Type":"ContainerStarted","Data":"2e6b3448899500ce0a7814e82ae6bd0b0d19a3525a6263d851312b9ee1a316b3"} Sep 30 19:45:18 crc kubenswrapper[4553]: I0930 19:45:18.972873 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" event={"ID":"5aa35519-bdc4-4eb7-a039-7238829d51ac","Type":"ContainerStarted","Data":"c77e3fe95b7ffbe0a4db521cbb82f1991b66d7ec95b38127b8452b2170a19f84"} Sep 30 19:45:19 crc kubenswrapper[4553]: I0930 19:45:19.577941 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:19 crc kubenswrapper[4553]: I0930 19:45:19.578229 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:19 crc kubenswrapper[4553]: I0930 19:45:19.615411 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:19 crc kubenswrapper[4553]: I0930 19:45:19.984737 4553 generic.go:334] "Generic (PLEG): container finished" podID="5aa35519-bdc4-4eb7-a039-7238829d51ac" containerID="2e6b3448899500ce0a7814e82ae6bd0b0d19a3525a6263d851312b9ee1a316b3" exitCode=0 Sep 30 19:45:19 crc kubenswrapper[4553]: I0930 19:45:19.984800 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" event={"ID":"5aa35519-bdc4-4eb7-a039-7238829d51ac","Type":"ContainerDied","Data":"2e6b3448899500ce0a7814e82ae6bd0b0d19a3525a6263d851312b9ee1a316b3"} Sep 30 19:45:20 crc kubenswrapper[4553]: I0930 19:45:20.052833 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:20 crc kubenswrapper[4553]: I0930 19:45:20.995619 4553 generic.go:334] "Generic (PLEG): container finished" podID="5aa35519-bdc4-4eb7-a039-7238829d51ac" containerID="c590e3319017822c62c0d0d142d28ffbb8172f921bb95dd5f1e30ee01e810dcc" exitCode=0 Sep 30 19:45:20 crc kubenswrapper[4553]: I0930 19:45:20.995707 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" event={"ID":"5aa35519-bdc4-4eb7-a039-7238829d51ac","Type":"ContainerDied","Data":"c590e3319017822c62c0d0d142d28ffbb8172f921bb95dd5f1e30ee01e810dcc"} Sep 30 19:45:22 crc kubenswrapper[4553]: I0930 19:45:22.010278 4553 generic.go:334] "Generic (PLEG): container finished" podID="5aa35519-bdc4-4eb7-a039-7238829d51ac" containerID="1523e2d0b73e782bc927158d34bdd0bc86dbe1b22a0807c7c1d2b3a326a3d773" exitCode=0 Sep 30 19:45:22 crc kubenswrapper[4553]: I0930 19:45:22.010351 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" event={"ID":"5aa35519-bdc4-4eb7-a039-7238829d51ac","Type":"ContainerDied","Data":"1523e2d0b73e782bc927158d34bdd0bc86dbe1b22a0807c7c1d2b3a326a3d773"} Sep 30 19:45:22 crc kubenswrapper[4553]: I0930 19:45:22.036885 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vlsw5"] Sep 30 19:45:22 crc kubenswrapper[4553]: I0930 19:45:22.037870 4553 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-vlsw5" podUID="2af749ee-af7b-49a4-9b1a-6058a96a7bc5" containerName="registry-server" containerID="cri-o://4fad4a64af09b485e1c3f69e52fac89cd830935da39d836c8f4b08349ab11899" gracePeriod=2 Sep 30 19:45:22 crc kubenswrapper[4553]: I0930 19:45:22.628195 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:22 crc kubenswrapper[4553]: I0930 19:45:22.718856 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-utilities\") pod \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\" (UID: \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\") " Sep 30 19:45:22 crc kubenswrapper[4553]: I0930 19:45:22.718980 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4lzr\" (UniqueName: \"kubernetes.io/projected/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-kube-api-access-g4lzr\") pod \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\" (UID: \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\") " Sep 30 19:45:22 crc kubenswrapper[4553]: I0930 19:45:22.720648 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-catalog-content\") pod \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\" (UID: \"2af749ee-af7b-49a4-9b1a-6058a96a7bc5\") " Sep 30 19:45:22 crc kubenswrapper[4553]: I0930 19:45:22.721271 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-utilities" (OuterVolumeSpecName: "utilities") pod "2af749ee-af7b-49a4-9b1a-6058a96a7bc5" (UID: "2af749ee-af7b-49a4-9b1a-6058a96a7bc5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:45:22 crc kubenswrapper[4553]: I0930 19:45:22.721512 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:45:22 crc kubenswrapper[4553]: I0930 19:45:22.727494 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-kube-api-access-g4lzr" (OuterVolumeSpecName: "kube-api-access-g4lzr") pod "2af749ee-af7b-49a4-9b1a-6058a96a7bc5" (UID: "2af749ee-af7b-49a4-9b1a-6058a96a7bc5"). InnerVolumeSpecName "kube-api-access-g4lzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:45:22 crc kubenswrapper[4553]: I0930 19:45:22.823113 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4lzr\" (UniqueName: \"kubernetes.io/projected/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-kube-api-access-g4lzr\") on node \"crc\" DevicePath \"\"" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.025035 4553 generic.go:334] "Generic (PLEG): container finished" podID="2af749ee-af7b-49a4-9b1a-6058a96a7bc5" containerID="4fad4a64af09b485e1c3f69e52fac89cd830935da39d836c8f4b08349ab11899" exitCode=0 Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.025167 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vlsw5" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.025227 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlsw5" event={"ID":"2af749ee-af7b-49a4-9b1a-6058a96a7bc5","Type":"ContainerDied","Data":"4fad4a64af09b485e1c3f69e52fac89cd830935da39d836c8f4b08349ab11899"} Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.025269 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vlsw5" event={"ID":"2af749ee-af7b-49a4-9b1a-6058a96a7bc5","Type":"ContainerDied","Data":"46e750361203c3b22b7179eec8884e63c29b33ab13c237894eaf0cd170054749"} Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.025301 4553 scope.go:117] "RemoveContainer" containerID="4fad4a64af09b485e1c3f69e52fac89cd830935da39d836c8f4b08349ab11899" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.050785 4553 scope.go:117] "RemoveContainer" containerID="7ae846c35c9dcb89fd21eeb70b7c52c291cffa959ac2d029537888e3fff557ce" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.087387 4553 scope.go:117] "RemoveContainer" containerID="f665242b7d0328f1190382a4ff2acc6491699e44bdfee387355f6d3745ce055e" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.111589 4553 scope.go:117] "RemoveContainer" containerID="4fad4a64af09b485e1c3f69e52fac89cd830935da39d836c8f4b08349ab11899" Sep 30 19:45:23 crc kubenswrapper[4553]: E0930 19:45:23.112355 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fad4a64af09b485e1c3f69e52fac89cd830935da39d836c8f4b08349ab11899\": container with ID starting with 4fad4a64af09b485e1c3f69e52fac89cd830935da39d836c8f4b08349ab11899 not found: ID does not exist" containerID="4fad4a64af09b485e1c3f69e52fac89cd830935da39d836c8f4b08349ab11899" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.112406 4553 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fad4a64af09b485e1c3f69e52fac89cd830935da39d836c8f4b08349ab11899"} err="failed to get container status \"4fad4a64af09b485e1c3f69e52fac89cd830935da39d836c8f4b08349ab11899\": rpc error: code = NotFound desc = could not find container \"4fad4a64af09b485e1c3f69e52fac89cd830935da39d836c8f4b08349ab11899\": container with ID starting with 4fad4a64af09b485e1c3f69e52fac89cd830935da39d836c8f4b08349ab11899 not found: ID does not exist" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.112441 4553 scope.go:117] "RemoveContainer" containerID="7ae846c35c9dcb89fd21eeb70b7c52c291cffa959ac2d029537888e3fff557ce" Sep 30 19:45:23 crc kubenswrapper[4553]: E0930 19:45:23.112894 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae846c35c9dcb89fd21eeb70b7c52c291cffa959ac2d029537888e3fff557ce\": container with ID starting with 7ae846c35c9dcb89fd21eeb70b7c52c291cffa959ac2d029537888e3fff557ce not found: ID does not exist" containerID="7ae846c35c9dcb89fd21eeb70b7c52c291cffa959ac2d029537888e3fff557ce" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.112938 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae846c35c9dcb89fd21eeb70b7c52c291cffa959ac2d029537888e3fff557ce"} err="failed to get container status \"7ae846c35c9dcb89fd21eeb70b7c52c291cffa959ac2d029537888e3fff557ce\": rpc error: code = NotFound desc = could not find container \"7ae846c35c9dcb89fd21eeb70b7c52c291cffa959ac2d029537888e3fff557ce\": container with ID starting with 7ae846c35c9dcb89fd21eeb70b7c52c291cffa959ac2d029537888e3fff557ce not found: ID does not exist" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.112966 4553 scope.go:117] "RemoveContainer" containerID="f665242b7d0328f1190382a4ff2acc6491699e44bdfee387355f6d3745ce055e" Sep 30 19:45:23 crc kubenswrapper[4553]: E0930 19:45:23.113761 4553 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f665242b7d0328f1190382a4ff2acc6491699e44bdfee387355f6d3745ce055e\": container with ID starting with f665242b7d0328f1190382a4ff2acc6491699e44bdfee387355f6d3745ce055e not found: ID does not exist" containerID="f665242b7d0328f1190382a4ff2acc6491699e44bdfee387355f6d3745ce055e" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.113855 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f665242b7d0328f1190382a4ff2acc6491699e44bdfee387355f6d3745ce055e"} err="failed to get container status \"f665242b7d0328f1190382a4ff2acc6491699e44bdfee387355f6d3745ce055e\": rpc error: code = NotFound desc = could not find container \"f665242b7d0328f1190382a4ff2acc6491699e44bdfee387355f6d3745ce055e\": container with ID starting with f665242b7d0328f1190382a4ff2acc6491699e44bdfee387355f6d3745ce055e not found: ID does not exist" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.426130 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.530664 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aa35519-bdc4-4eb7-a039-7238829d51ac-util\") pod \"5aa35519-bdc4-4eb7-a039-7238829d51ac\" (UID: \"5aa35519-bdc4-4eb7-a039-7238829d51ac\") " Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.530740 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aa35519-bdc4-4eb7-a039-7238829d51ac-bundle\") pod \"5aa35519-bdc4-4eb7-a039-7238829d51ac\" (UID: \"5aa35519-bdc4-4eb7-a039-7238829d51ac\") " Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.530850 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl8xb\" (UniqueName: \"kubernetes.io/projected/5aa35519-bdc4-4eb7-a039-7238829d51ac-kube-api-access-tl8xb\") pod \"5aa35519-bdc4-4eb7-a039-7238829d51ac\" (UID: \"5aa35519-bdc4-4eb7-a039-7238829d51ac\") " Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.531405 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aa35519-bdc4-4eb7-a039-7238829d51ac-bundle" (OuterVolumeSpecName: "bundle") pod "5aa35519-bdc4-4eb7-a039-7238829d51ac" (UID: "5aa35519-bdc4-4eb7-a039-7238829d51ac"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.531801 4553 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aa35519-bdc4-4eb7-a039-7238829d51ac-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.537903 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aa35519-bdc4-4eb7-a039-7238829d51ac-kube-api-access-tl8xb" (OuterVolumeSpecName: "kube-api-access-tl8xb") pod "5aa35519-bdc4-4eb7-a039-7238829d51ac" (UID: "5aa35519-bdc4-4eb7-a039-7238829d51ac"). InnerVolumeSpecName "kube-api-access-tl8xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.543753 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aa35519-bdc4-4eb7-a039-7238829d51ac-util" (OuterVolumeSpecName: "util") pod "5aa35519-bdc4-4eb7-a039-7238829d51ac" (UID: "5aa35519-bdc4-4eb7-a039-7238829d51ac"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.579337 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2af749ee-af7b-49a4-9b1a-6058a96a7bc5" (UID: "2af749ee-af7b-49a4-9b1a-6058a96a7bc5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.632953 4553 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aa35519-bdc4-4eb7-a039-7238829d51ac-util\") on node \"crc\" DevicePath \"\"" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.632987 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af749ee-af7b-49a4-9b1a-6058a96a7bc5-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.633001 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl8xb\" (UniqueName: \"kubernetes.io/projected/5aa35519-bdc4-4eb7-a039-7238829d51ac-kube-api-access-tl8xb\") on node \"crc\" DevicePath \"\"" Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.661089 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vlsw5"] Sep 30 19:45:23 crc kubenswrapper[4553]: I0930 19:45:23.664078 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vlsw5"] Sep 30 19:45:24 crc kubenswrapper[4553]: I0930 19:45:24.040926 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" event={"ID":"5aa35519-bdc4-4eb7-a039-7238829d51ac","Type":"ContainerDied","Data":"c77e3fe95b7ffbe0a4db521cbb82f1991b66d7ec95b38127b8452b2170a19f84"} Sep 30 19:45:24 crc kubenswrapper[4553]: I0930 19:45:24.040977 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c77e3fe95b7ffbe0a4db521cbb82f1991b66d7ec95b38127b8452b2170a19f84" Sep 30 19:45:24 crc kubenswrapper[4553]: I0930 19:45:24.041118 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj" Sep 30 19:45:25 crc kubenswrapper[4553]: I0930 19:45:25.518914 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af749ee-af7b-49a4-9b1a-6058a96a7bc5" path="/var/lib/kubelet/pods/2af749ee-af7b-49a4-9b1a-6058a96a7bc5/volumes" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.029897 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg"] Sep 30 19:45:29 crc kubenswrapper[4553]: E0930 19:45:29.030569 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af749ee-af7b-49a4-9b1a-6058a96a7bc5" containerName="extract-utilities" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.030582 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af749ee-af7b-49a4-9b1a-6058a96a7bc5" containerName="extract-utilities" Sep 30 19:45:29 crc kubenswrapper[4553]: E0930 19:45:29.030593 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aa35519-bdc4-4eb7-a039-7238829d51ac" containerName="pull" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.030600 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa35519-bdc4-4eb7-a039-7238829d51ac" containerName="pull" Sep 30 19:45:29 crc kubenswrapper[4553]: E0930 19:45:29.030611 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aa35519-bdc4-4eb7-a039-7238829d51ac" containerName="util" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.030618 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa35519-bdc4-4eb7-a039-7238829d51ac" containerName="util" Sep 30 19:45:29 crc kubenswrapper[4553]: E0930 19:45:29.030626 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aa35519-bdc4-4eb7-a039-7238829d51ac" containerName="extract" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.030632 4553 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="5aa35519-bdc4-4eb7-a039-7238829d51ac" containerName="extract" Sep 30 19:45:29 crc kubenswrapper[4553]: E0930 19:45:29.030642 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af749ee-af7b-49a4-9b1a-6058a96a7bc5" containerName="extract-content" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.030648 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af749ee-af7b-49a4-9b1a-6058a96a7bc5" containerName="extract-content" Sep 30 19:45:29 crc kubenswrapper[4553]: E0930 19:45:29.030656 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af749ee-af7b-49a4-9b1a-6058a96a7bc5" containerName="registry-server" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.030662 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af749ee-af7b-49a4-9b1a-6058a96a7bc5" containerName="registry-server" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.030755 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aa35519-bdc4-4eb7-a039-7238829d51ac" containerName="extract" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.030772 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af749ee-af7b-49a4-9b1a-6058a96a7bc5" containerName="registry-server" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.031304 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.034435 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-62rvh" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.060897 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg"] Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.211396 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqdn4\" (UniqueName: \"kubernetes.io/projected/7c599bbe-1b64-4563-b235-5f2c58d234b5-kube-api-access-cqdn4\") pod \"openstack-operator-controller-operator-67dd46bc9f-2kzlg\" (UID: \"7c599bbe-1b64-4563-b235-5f2c58d234b5\") " pod="openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.317431 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqdn4\" (UniqueName: \"kubernetes.io/projected/7c599bbe-1b64-4563-b235-5f2c58d234b5-kube-api-access-cqdn4\") pod \"openstack-operator-controller-operator-67dd46bc9f-2kzlg\" (UID: \"7c599bbe-1b64-4563-b235-5f2c58d234b5\") " pod="openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.363436 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqdn4\" (UniqueName: \"kubernetes.io/projected/7c599bbe-1b64-4563-b235-5f2c58d234b5-kube-api-access-cqdn4\") pod \"openstack-operator-controller-operator-67dd46bc9f-2kzlg\" (UID: \"7c599bbe-1b64-4563-b235-5f2c58d234b5\") " pod="openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.585572 4553 
patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.585624 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.585667 4553 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.586207 4553 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d0705cac6e5b952d02766c3f1729599066280437bbe55ec8f4688736bf24a4f"} pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.586286 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" containerID="cri-o://4d0705cac6e5b952d02766c3f1729599066280437bbe55ec8f4688736bf24a4f" gracePeriod=600 Sep 30 19:45:29 crc kubenswrapper[4553]: I0930 19:45:29.646548 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg" Sep 30 19:45:30 crc kubenswrapper[4553]: I0930 19:45:30.094483 4553 generic.go:334] "Generic (PLEG): container finished" podID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerID="4d0705cac6e5b952d02766c3f1729599066280437bbe55ec8f4688736bf24a4f" exitCode=0 Sep 30 19:45:30 crc kubenswrapper[4553]: I0930 19:45:30.094728 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerDied","Data":"4d0705cac6e5b952d02766c3f1729599066280437bbe55ec8f4688736bf24a4f"} Sep 30 19:45:30 crc kubenswrapper[4553]: I0930 19:45:30.094755 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerStarted","Data":"6c53001a48c79a1addca634bfcf9ef4be43fc5d44c498f0ba986c32047fcaed3"} Sep 30 19:45:30 crc kubenswrapper[4553]: I0930 19:45:30.094771 4553 scope.go:117] "RemoveContainer" containerID="51154b57f12370c60080e989e52d35722515976fa625d655a2c4cbbb683003ca" Sep 30 19:45:30 crc kubenswrapper[4553]: I0930 19:45:30.160170 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg"] Sep 30 19:45:30 crc kubenswrapper[4553]: W0930 19:45:30.172550 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c599bbe_1b64_4563_b235_5f2c58d234b5.slice/crio-cfe70344b1bae497e55f042ace6d501c22421e74012ef6b240fd07b21d94cc0e WatchSource:0}: Error finding container cfe70344b1bae497e55f042ace6d501c22421e74012ef6b240fd07b21d94cc0e: Status 404 returned error can't find the container with id cfe70344b1bae497e55f042ace6d501c22421e74012ef6b240fd07b21d94cc0e Sep 30 19:45:31 crc kubenswrapper[4553]: 
I0930 19:45:31.104195 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg" event={"ID":"7c599bbe-1b64-4563-b235-5f2c58d234b5","Type":"ContainerStarted","Data":"cfe70344b1bae497e55f042ace6d501c22421e74012ef6b240fd07b21d94cc0e"} Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.130807 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg" event={"ID":"7c599bbe-1b64-4563-b235-5f2c58d234b5","Type":"ContainerStarted","Data":"5f74cfd00f0bc1781b5482f8d9e2a04aa9337cc6dd27fe2cac23b6c77490563a"} Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.232789 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jd8t2"] Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.233971 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jd8t2" Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.246542 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jd8t2"] Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.325602 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-utilities\") pod \"certified-operators-jd8t2\" (UID: \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\") " pod="openshift-marketplace/certified-operators-jd8t2" Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.325924 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-catalog-content\") pod \"certified-operators-jd8t2\" (UID: \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\") " 
pod="openshift-marketplace/certified-operators-jd8t2" Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.325949 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppwg2\" (UniqueName: \"kubernetes.io/projected/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-kube-api-access-ppwg2\") pod \"certified-operators-jd8t2\" (UID: \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\") " pod="openshift-marketplace/certified-operators-jd8t2" Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.427251 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-catalog-content\") pod \"certified-operators-jd8t2\" (UID: \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\") " pod="openshift-marketplace/certified-operators-jd8t2" Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.427308 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppwg2\" (UniqueName: \"kubernetes.io/projected/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-kube-api-access-ppwg2\") pod \"certified-operators-jd8t2\" (UID: \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\") " pod="openshift-marketplace/certified-operators-jd8t2" Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.427354 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-utilities\") pod \"certified-operators-jd8t2\" (UID: \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\") " pod="openshift-marketplace/certified-operators-jd8t2" Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.427883 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-utilities\") pod \"certified-operators-jd8t2\" (UID: \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\") " 
pod="openshift-marketplace/certified-operators-jd8t2" Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.427881 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-catalog-content\") pod \"certified-operators-jd8t2\" (UID: \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\") " pod="openshift-marketplace/certified-operators-jd8t2" Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.450126 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppwg2\" (UniqueName: \"kubernetes.io/projected/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-kube-api-access-ppwg2\") pod \"certified-operators-jd8t2\" (UID: \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\") " pod="openshift-marketplace/certified-operators-jd8t2" Sep 30 19:45:35 crc kubenswrapper[4553]: I0930 19:45:35.583539 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jd8t2" Sep 30 19:45:36 crc kubenswrapper[4553]: I0930 19:45:36.457709 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jd8t2"] Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.035546 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6m559"] Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.036836 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6m559" Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.047812 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6m559"] Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.051478 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ac1af6-643f-4463-a0fa-ae6000372b2b-utilities\") pod \"redhat-marketplace-6m559\" (UID: \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\") " pod="openshift-marketplace/redhat-marketplace-6m559" Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.051613 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hvz5\" (UniqueName: \"kubernetes.io/projected/d4ac1af6-643f-4463-a0fa-ae6000372b2b-kube-api-access-2hvz5\") pod \"redhat-marketplace-6m559\" (UID: \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\") " pod="openshift-marketplace/redhat-marketplace-6m559" Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.051698 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ac1af6-643f-4463-a0fa-ae6000372b2b-catalog-content\") pod \"redhat-marketplace-6m559\" (UID: \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\") " pod="openshift-marketplace/redhat-marketplace-6m559" Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.153148 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ac1af6-643f-4463-a0fa-ae6000372b2b-catalog-content\") pod \"redhat-marketplace-6m559\" (UID: \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\") " pod="openshift-marketplace/redhat-marketplace-6m559" Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.153562 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ac1af6-643f-4463-a0fa-ae6000372b2b-utilities\") pod \"redhat-marketplace-6m559\" (UID: \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\") " pod="openshift-marketplace/redhat-marketplace-6m559" Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.153701 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hvz5\" (UniqueName: \"kubernetes.io/projected/d4ac1af6-643f-4463-a0fa-ae6000372b2b-kube-api-access-2hvz5\") pod \"redhat-marketplace-6m559\" (UID: \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\") " pod="openshift-marketplace/redhat-marketplace-6m559" Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.153721 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ac1af6-643f-4463-a0fa-ae6000372b2b-catalog-content\") pod \"redhat-marketplace-6m559\" (UID: \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\") " pod="openshift-marketplace/redhat-marketplace-6m559" Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.153972 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ac1af6-643f-4463-a0fa-ae6000372b2b-utilities\") pod \"redhat-marketplace-6m559\" (UID: \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\") " pod="openshift-marketplace/redhat-marketplace-6m559" Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.154849 4553 generic.go:334] "Generic (PLEG): container finished" podID="4a55b8cf-ca6b-4b9c-b58e-983421a5e689" containerID="cf7f7d49bf2cad83adebe490a4137ca5397b17fbdc5b2c7fd87fbae473fc5b0d" exitCode=0 Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.154941 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd8t2" 
event={"ID":"4a55b8cf-ca6b-4b9c-b58e-983421a5e689","Type":"ContainerDied","Data":"cf7f7d49bf2cad83adebe490a4137ca5397b17fbdc5b2c7fd87fbae473fc5b0d"}
Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.154983 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd8t2" event={"ID":"4a55b8cf-ca6b-4b9c-b58e-983421a5e689","Type":"ContainerStarted","Data":"82245943b0af9dc236a28489117d70c46a4bd67d7047a18b4258f8d842c0e94a"}
Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.157718 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg" event={"ID":"7c599bbe-1b64-4563-b235-5f2c58d234b5","Type":"ContainerStarted","Data":"19c015c73fddd161a5c1ad7fb22cf34873efc57b6d1fed6440f34bcbdc8d624c"}
Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.158268 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg"
Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.172053 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hvz5\" (UniqueName: \"kubernetes.io/projected/d4ac1af6-643f-4463-a0fa-ae6000372b2b-kube-api-access-2hvz5\") pod \"redhat-marketplace-6m559\" (UID: \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\") " pod="openshift-marketplace/redhat-marketplace-6m559"
Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.202887 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg" podStartSLOduration=1.4235326320000001 podStartE2EDuration="8.202870026s" podCreationTimestamp="2025-09-30 19:45:29 +0000 UTC" firstStartedPulling="2025-09-30 19:45:30.174415142 +0000 UTC m=+783.373917272" lastFinishedPulling="2025-09-30 19:45:36.953752536 +0000 UTC m=+790.153254666" observedRunningTime="2025-09-30 19:45:37.198453588 +0000 UTC m=+790.397955718" watchObservedRunningTime="2025-09-30 19:45:37.202870026 +0000 UTC m=+790.402372156"
Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.369774 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6m559"
Sep 30 19:45:37 crc kubenswrapper[4553]: I0930 19:45:37.870281 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6m559"]
Sep 30 19:45:38 crc kubenswrapper[4553]: I0930 19:45:38.167030 4553 generic.go:334] "Generic (PLEG): container finished" podID="4a55b8cf-ca6b-4b9c-b58e-983421a5e689" containerID="0ff34e33ceeabc7a122d825fa9ba2fc3142e33212c48d2b1864af488f97cae8c" exitCode=0
Sep 30 19:45:38 crc kubenswrapper[4553]: I0930 19:45:38.167082 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd8t2" event={"ID":"4a55b8cf-ca6b-4b9c-b58e-983421a5e689","Type":"ContainerDied","Data":"0ff34e33ceeabc7a122d825fa9ba2fc3142e33212c48d2b1864af488f97cae8c"}
Sep 30 19:45:38 crc kubenswrapper[4553]: I0930 19:45:38.171390 4553 generic.go:334] "Generic (PLEG): container finished" podID="d4ac1af6-643f-4463-a0fa-ae6000372b2b" containerID="a30cc8b1485ef149e571ffbc310ce72e550c7ab05ae149527c68c944373ee7f2" exitCode=0
Sep 30 19:45:38 crc kubenswrapper[4553]: I0930 19:45:38.171471 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6m559" event={"ID":"d4ac1af6-643f-4463-a0fa-ae6000372b2b","Type":"ContainerDied","Data":"a30cc8b1485ef149e571ffbc310ce72e550c7ab05ae149527c68c944373ee7f2"}
Sep 30 19:45:38 crc kubenswrapper[4553]: I0930 19:45:38.171498 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6m559" event={"ID":"d4ac1af6-643f-4463-a0fa-ae6000372b2b","Type":"ContainerStarted","Data":"34ab6a54bdaba344f45bcae0af6ead05b0a02838cc4f26404d665605e0edfc62"}
Sep 30 19:45:39 crc kubenswrapper[4553]: I0930 19:45:39.183999 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd8t2" event={"ID":"4a55b8cf-ca6b-4b9c-b58e-983421a5e689","Type":"ContainerStarted","Data":"a2543eb57820b8845c71424478bbe93b9737ed2ed90ef36213a60f53bc78edfc"}
Sep 30 19:45:39 crc kubenswrapper[4553]: I0930 19:45:39.188429 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-67dd46bc9f-2kzlg"
Sep 30 19:45:39 crc kubenswrapper[4553]: I0930 19:45:39.233877 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jd8t2" podStartSLOduration=2.686565819 podStartE2EDuration="4.233862935s" podCreationTimestamp="2025-09-30 19:45:35 +0000 UTC" firstStartedPulling="2025-09-30 19:45:37.156154498 +0000 UTC m=+790.355656618" lastFinishedPulling="2025-09-30 19:45:38.703451604 +0000 UTC m=+791.902953734" observedRunningTime="2025-09-30 19:45:39.204346546 +0000 UTC m=+792.403848676" watchObservedRunningTime="2025-09-30 19:45:39.233862935 +0000 UTC m=+792.433365065"
Sep 30 19:45:40 crc kubenswrapper[4553]: I0930 19:45:40.190769 4553 generic.go:334] "Generic (PLEG): container finished" podID="d4ac1af6-643f-4463-a0fa-ae6000372b2b" containerID="adb84422d9e32914536dc2b1d046ad0526b6273884d3718791763dd9d6cd2085" exitCode=0
Sep 30 19:45:40 crc kubenswrapper[4553]: I0930 19:45:40.190952 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6m559" event={"ID":"d4ac1af6-643f-4463-a0fa-ae6000372b2b","Type":"ContainerDied","Data":"adb84422d9e32914536dc2b1d046ad0526b6273884d3718791763dd9d6cd2085"}
Sep 30 19:45:41 crc kubenswrapper[4553]: I0930 19:45:41.199986 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6m559" event={"ID":"d4ac1af6-643f-4463-a0fa-ae6000372b2b","Type":"ContainerStarted","Data":"85916b7d4d202925061c214be18ee7d727e7a17307591acb9ba1556426b622dc"}
Sep 30 19:45:41 crc kubenswrapper[4553]: I0930 19:45:41.222634 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6m559" podStartSLOduration=1.8039317910000001 podStartE2EDuration="4.222614164s" podCreationTimestamp="2025-09-30 19:45:37 +0000 UTC" firstStartedPulling="2025-09-30 19:45:38.173026713 +0000 UTC m=+791.372528883" lastFinishedPulling="2025-09-30 19:45:40.591709126 +0000 UTC m=+793.791211256" observedRunningTime="2025-09-30 19:45:41.220520848 +0000 UTC m=+794.420022998" watchObservedRunningTime="2025-09-30 19:45:41.222614164 +0000 UTC m=+794.422116294"
Sep 30 19:45:45 crc kubenswrapper[4553]: I0930 19:45:45.584505 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jd8t2"
Sep 30 19:45:45 crc kubenswrapper[4553]: I0930 19:45:45.585809 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jd8t2"
Sep 30 19:45:45 crc kubenswrapper[4553]: I0930 19:45:45.640716 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jd8t2"
Sep 30 19:45:46 crc kubenswrapper[4553]: I0930 19:45:46.274694 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jd8t2"
Sep 30 19:45:46 crc kubenswrapper[4553]: I0930 19:45:46.641681 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jd8t2"]
Sep 30 19:45:47 crc kubenswrapper[4553]: I0930 19:45:47.370878 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6m559"
Sep 30 19:45:47 crc kubenswrapper[4553]: I0930 19:45:47.371356 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6m559"
Sep 30 19:45:47 crc kubenswrapper[4553]: I0930 19:45:47.439181 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6m559"
Sep 30 19:45:48 crc kubenswrapper[4553]: I0930 19:45:48.235610 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jd8t2" podUID="4a55b8cf-ca6b-4b9c-b58e-983421a5e689" containerName="registry-server" containerID="cri-o://a2543eb57820b8845c71424478bbe93b9737ed2ed90ef36213a60f53bc78edfc" gracePeriod=2
Sep 30 19:45:48 crc kubenswrapper[4553]: I0930 19:45:48.349142 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6m559"
Sep 30 19:45:48 crc kubenswrapper[4553]: I0930 19:45:48.824637 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jd8t2"
Sep 30 19:45:48 crc kubenswrapper[4553]: I0930 19:45:48.914984 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-catalog-content\") pod \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\" (UID: \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\") "
Sep 30 19:45:48 crc kubenswrapper[4553]: I0930 19:45:48.915151 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-utilities\") pod \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\" (UID: \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\") "
Sep 30 19:45:48 crc kubenswrapper[4553]: I0930 19:45:48.915257 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppwg2\" (UniqueName: \"kubernetes.io/projected/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-kube-api-access-ppwg2\") pod \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\" (UID: \"4a55b8cf-ca6b-4b9c-b58e-983421a5e689\") "
Sep 30 19:45:48 crc kubenswrapper[4553]: I0930 19:45:48.916114 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-utilities" (OuterVolumeSpecName: "utilities") pod "4a55b8cf-ca6b-4b9c-b58e-983421a5e689" (UID: "4a55b8cf-ca6b-4b9c-b58e-983421a5e689"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:45:48 crc kubenswrapper[4553]: I0930 19:45:48.934972 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-kube-api-access-ppwg2" (OuterVolumeSpecName: "kube-api-access-ppwg2") pod "4a55b8cf-ca6b-4b9c-b58e-983421a5e689" (UID: "4a55b8cf-ca6b-4b9c-b58e-983421a5e689"). InnerVolumeSpecName "kube-api-access-ppwg2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:45:48 crc kubenswrapper[4553]: I0930 19:45:48.966487 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a55b8cf-ca6b-4b9c-b58e-983421a5e689" (UID: "4a55b8cf-ca6b-4b9c-b58e-983421a5e689"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.016940 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.016982 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.017003 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppwg2\" (UniqueName: \"kubernetes.io/projected/4a55b8cf-ca6b-4b9c-b58e-983421a5e689-kube-api-access-ppwg2\") on node \"crc\" DevicePath \"\""
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.243297 4553 generic.go:334] "Generic (PLEG): container finished" podID="4a55b8cf-ca6b-4b9c-b58e-983421a5e689" containerID="a2543eb57820b8845c71424478bbe93b9737ed2ed90ef36213a60f53bc78edfc" exitCode=0
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.243370 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jd8t2"
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.243413 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd8t2" event={"ID":"4a55b8cf-ca6b-4b9c-b58e-983421a5e689","Type":"ContainerDied","Data":"a2543eb57820b8845c71424478bbe93b9737ed2ed90ef36213a60f53bc78edfc"}
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.243443 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jd8t2" event={"ID":"4a55b8cf-ca6b-4b9c-b58e-983421a5e689","Type":"ContainerDied","Data":"82245943b0af9dc236a28489117d70c46a4bd67d7047a18b4258f8d842c0e94a"}
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.243467 4553 scope.go:117] "RemoveContainer" containerID="a2543eb57820b8845c71424478bbe93b9737ed2ed90ef36213a60f53bc78edfc"
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.261119 4553 scope.go:117] "RemoveContainer" containerID="0ff34e33ceeabc7a122d825fa9ba2fc3142e33212c48d2b1864af488f97cae8c"
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.286224 4553 scope.go:117] "RemoveContainer" containerID="cf7f7d49bf2cad83adebe490a4137ca5397b17fbdc5b2c7fd87fbae473fc5b0d"
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.294706 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jd8t2"]
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.306563 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jd8t2"]
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.307165 4553 scope.go:117] "RemoveContainer" containerID="a2543eb57820b8845c71424478bbe93b9737ed2ed90ef36213a60f53bc78edfc"
Sep 30 19:45:49 crc kubenswrapper[4553]: E0930 19:45:49.307598 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2543eb57820b8845c71424478bbe93b9737ed2ed90ef36213a60f53bc78edfc\": container with ID starting with a2543eb57820b8845c71424478bbe93b9737ed2ed90ef36213a60f53bc78edfc not found: ID does not exist" containerID="a2543eb57820b8845c71424478bbe93b9737ed2ed90ef36213a60f53bc78edfc"
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.307635 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2543eb57820b8845c71424478bbe93b9737ed2ed90ef36213a60f53bc78edfc"} err="failed to get container status \"a2543eb57820b8845c71424478bbe93b9737ed2ed90ef36213a60f53bc78edfc\": rpc error: code = NotFound desc = could not find container \"a2543eb57820b8845c71424478bbe93b9737ed2ed90ef36213a60f53bc78edfc\": container with ID starting with a2543eb57820b8845c71424478bbe93b9737ed2ed90ef36213a60f53bc78edfc not found: ID does not exist"
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.307655 4553 scope.go:117] "RemoveContainer" containerID="0ff34e33ceeabc7a122d825fa9ba2fc3142e33212c48d2b1864af488f97cae8c"
Sep 30 19:45:49 crc kubenswrapper[4553]: E0930 19:45:49.309483 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ff34e33ceeabc7a122d825fa9ba2fc3142e33212c48d2b1864af488f97cae8c\": container with ID starting with 0ff34e33ceeabc7a122d825fa9ba2fc3142e33212c48d2b1864af488f97cae8c not found: ID does not exist" containerID="0ff34e33ceeabc7a122d825fa9ba2fc3142e33212c48d2b1864af488f97cae8c"
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.309512 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ff34e33ceeabc7a122d825fa9ba2fc3142e33212c48d2b1864af488f97cae8c"} err="failed to get container status \"0ff34e33ceeabc7a122d825fa9ba2fc3142e33212c48d2b1864af488f97cae8c\": rpc error: code = NotFound desc = could not find container \"0ff34e33ceeabc7a122d825fa9ba2fc3142e33212c48d2b1864af488f97cae8c\": container with ID starting with 0ff34e33ceeabc7a122d825fa9ba2fc3142e33212c48d2b1864af488f97cae8c not found: ID does not exist"
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.309528 4553 scope.go:117] "RemoveContainer" containerID="cf7f7d49bf2cad83adebe490a4137ca5397b17fbdc5b2c7fd87fbae473fc5b0d"
Sep 30 19:45:49 crc kubenswrapper[4553]: E0930 19:45:49.309770 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf7f7d49bf2cad83adebe490a4137ca5397b17fbdc5b2c7fd87fbae473fc5b0d\": container with ID starting with cf7f7d49bf2cad83adebe490a4137ca5397b17fbdc5b2c7fd87fbae473fc5b0d not found: ID does not exist" containerID="cf7f7d49bf2cad83adebe490a4137ca5397b17fbdc5b2c7fd87fbae473fc5b0d"
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.309791 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf7f7d49bf2cad83adebe490a4137ca5397b17fbdc5b2c7fd87fbae473fc5b0d"} err="failed to get container status \"cf7f7d49bf2cad83adebe490a4137ca5397b17fbdc5b2c7fd87fbae473fc5b0d\": rpc error: code = NotFound desc = could not find container \"cf7f7d49bf2cad83adebe490a4137ca5397b17fbdc5b2c7fd87fbae473fc5b0d\": container with ID starting with cf7f7d49bf2cad83adebe490a4137ca5397b17fbdc5b2c7fd87fbae473fc5b0d not found: ID does not exist"
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.511141 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a55b8cf-ca6b-4b9c-b58e-983421a5e689" path="/var/lib/kubelet/pods/4a55b8cf-ca6b-4b9c-b58e-983421a5e689/volumes"
Sep 30 19:45:49 crc kubenswrapper[4553]: I0930 19:45:49.825962 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6m559"]
Sep 30 19:45:50 crc kubenswrapper[4553]: I0930 19:45:50.250070 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6m559" podUID="d4ac1af6-643f-4463-a0fa-ae6000372b2b" containerName="registry-server" containerID="cri-o://85916b7d4d202925061c214be18ee7d727e7a17307591acb9ba1556426b622dc" gracePeriod=2
Sep 30 19:45:51 crc kubenswrapper[4553]: I0930 19:45:51.273735 4553 generic.go:334] "Generic (PLEG): container finished" podID="d4ac1af6-643f-4463-a0fa-ae6000372b2b" containerID="85916b7d4d202925061c214be18ee7d727e7a17307591acb9ba1556426b622dc" exitCode=0
Sep 30 19:45:51 crc kubenswrapper[4553]: I0930 19:45:51.273959 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6m559" event={"ID":"d4ac1af6-643f-4463-a0fa-ae6000372b2b","Type":"ContainerDied","Data":"85916b7d4d202925061c214be18ee7d727e7a17307591acb9ba1556426b622dc"}
Sep 30 19:45:51 crc kubenswrapper[4553]: I0930 19:45:51.587497 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6m559"
Sep 30 19:45:51 crc kubenswrapper[4553]: I0930 19:45:51.748868 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hvz5\" (UniqueName: \"kubernetes.io/projected/d4ac1af6-643f-4463-a0fa-ae6000372b2b-kube-api-access-2hvz5\") pod \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\" (UID: \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\") "
Sep 30 19:45:51 crc kubenswrapper[4553]: I0930 19:45:51.748950 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ac1af6-643f-4463-a0fa-ae6000372b2b-utilities\") pod \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\" (UID: \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\") "
Sep 30 19:45:51 crc kubenswrapper[4553]: I0930 19:45:51.748985 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ac1af6-643f-4463-a0fa-ae6000372b2b-catalog-content\") pod \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\" (UID: \"d4ac1af6-643f-4463-a0fa-ae6000372b2b\") "
Sep 30 19:45:51 crc kubenswrapper[4553]: I0930 19:45:51.749708 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4ac1af6-643f-4463-a0fa-ae6000372b2b-utilities" (OuterVolumeSpecName: "utilities") pod "d4ac1af6-643f-4463-a0fa-ae6000372b2b" (UID: "d4ac1af6-643f-4463-a0fa-ae6000372b2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:45:51 crc kubenswrapper[4553]: I0930 19:45:51.758226 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ac1af6-643f-4463-a0fa-ae6000372b2b-kube-api-access-2hvz5" (OuterVolumeSpecName: "kube-api-access-2hvz5") pod "d4ac1af6-643f-4463-a0fa-ae6000372b2b" (UID: "d4ac1af6-643f-4463-a0fa-ae6000372b2b"). InnerVolumeSpecName "kube-api-access-2hvz5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:45:51 crc kubenswrapper[4553]: I0930 19:45:51.762477 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4ac1af6-643f-4463-a0fa-ae6000372b2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4ac1af6-643f-4463-a0fa-ae6000372b2b" (UID: "d4ac1af6-643f-4463-a0fa-ae6000372b2b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:45:51 crc kubenswrapper[4553]: I0930 19:45:51.850524 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hvz5\" (UniqueName: \"kubernetes.io/projected/d4ac1af6-643f-4463-a0fa-ae6000372b2b-kube-api-access-2hvz5\") on node \"crc\" DevicePath \"\""
Sep 30 19:45:51 crc kubenswrapper[4553]: I0930 19:45:51.850766 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4ac1af6-643f-4463-a0fa-ae6000372b2b-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 19:45:51 crc kubenswrapper[4553]: I0930 19:45:51.850840 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4ac1af6-643f-4463-a0fa-ae6000372b2b-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 19:45:52 crc kubenswrapper[4553]: I0930 19:45:52.295785 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6m559" event={"ID":"d4ac1af6-643f-4463-a0fa-ae6000372b2b","Type":"ContainerDied","Data":"34ab6a54bdaba344f45bcae0af6ead05b0a02838cc4f26404d665605e0edfc62"}
Sep 30 19:45:52 crc kubenswrapper[4553]: I0930 19:45:52.295858 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6m559"
Sep 30 19:45:52 crc kubenswrapper[4553]: I0930 19:45:52.296216 4553 scope.go:117] "RemoveContainer" containerID="85916b7d4d202925061c214be18ee7d727e7a17307591acb9ba1556426b622dc"
Sep 30 19:45:52 crc kubenswrapper[4553]: I0930 19:45:52.339867 4553 scope.go:117] "RemoveContainer" containerID="adb84422d9e32914536dc2b1d046ad0526b6273884d3718791763dd9d6cd2085"
Sep 30 19:45:52 crc kubenswrapper[4553]: I0930 19:45:52.359639 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6m559"]
Sep 30 19:45:52 crc kubenswrapper[4553]: I0930 19:45:52.366372 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6m559"]
Sep 30 19:45:52 crc kubenswrapper[4553]: I0930 19:45:52.387396 4553 scope.go:117] "RemoveContainer" containerID="a30cc8b1485ef149e571ffbc310ce72e550c7ab05ae149527c68c944373ee7f2"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.464965 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk"]
Sep 30 19:45:53 crc kubenswrapper[4553]: E0930 19:45:53.465227 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ac1af6-643f-4463-a0fa-ae6000372b2b" containerName="extract-content"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.465240 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ac1af6-643f-4463-a0fa-ae6000372b2b" containerName="extract-content"
Sep 30 19:45:53 crc kubenswrapper[4553]: E0930 19:45:53.465259 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a55b8cf-ca6b-4b9c-b58e-983421a5e689" containerName="extract-content"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.465265 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a55b8cf-ca6b-4b9c-b58e-983421a5e689" containerName="extract-content"
Sep 30 19:45:53 crc kubenswrapper[4553]: E0930 19:45:53.465275 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a55b8cf-ca6b-4b9c-b58e-983421a5e689" containerName="registry-server"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.465281 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a55b8cf-ca6b-4b9c-b58e-983421a5e689" containerName="registry-server"
Sep 30 19:45:53 crc kubenswrapper[4553]: E0930 19:45:53.465289 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ac1af6-643f-4463-a0fa-ae6000372b2b" containerName="registry-server"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.465295 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ac1af6-643f-4463-a0fa-ae6000372b2b" containerName="registry-server"
Sep 30 19:45:53 crc kubenswrapper[4553]: E0930 19:45:53.465302 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a55b8cf-ca6b-4b9c-b58e-983421a5e689" containerName="extract-utilities"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.465308 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a55b8cf-ca6b-4b9c-b58e-983421a5e689" containerName="extract-utilities"
Sep 30 19:45:53 crc kubenswrapper[4553]: E0930 19:45:53.465316 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ac1af6-643f-4463-a0fa-ae6000372b2b" containerName="extract-utilities"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.465321 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ac1af6-643f-4463-a0fa-ae6000372b2b" containerName="extract-utilities"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.465440 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ac1af6-643f-4463-a0fa-ae6000372b2b" containerName="registry-server"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.465449 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a55b8cf-ca6b-4b9c-b58e-983421a5e689" containerName="registry-server"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.465999 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.468931 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-9tp4j"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.472652 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.488658 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.489902 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.491670 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.492663 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.493860 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-n8jqx"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.495725 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-4phjh"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.518262 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ac1af6-643f-4463-a0fa-ae6000372b2b" path="/var/lib/kubelet/pods/d4ac1af6-643f-4463-a0fa-ae6000372b2b/volumes"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.540887 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.558598 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.559681 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.566659 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.567612 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.570328 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-mxdbz"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.570709 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-tg48b"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.571320 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.572655 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddn7d\" (UniqueName: \"kubernetes.io/projected/aebfd6cd-5a72-4797-b16a-492efaa1016e-kube-api-access-ddn7d\") pod \"barbican-operator-controller-manager-6ff8b75857-r2zwk\" (UID: \"aebfd6cd-5a72-4797-b16a-492efaa1016e\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.572688 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb95s\" (UniqueName: \"kubernetes.io/projected/a38189f9-08d1-4f4b-8949-4856b0f46d95-kube-api-access-wb95s\") pod \"cinder-operator-controller-manager-644bddb6d8-r7vxh\" (UID: \"a38189f9-08d1-4f4b-8949-4856b0f46d95\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.572763 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n228h\" (UniqueName: \"kubernetes.io/projected/485f984b-4520-4753-b6e7-4584137d3d58-kube-api-access-n228h\") pod \"designate-operator-controller-manager-84f4f7b77b-5k8k5\" (UID: \"485f984b-4520-4753-b6e7-4584137d3d58\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.586564 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.587735 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.589582 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-mcg44"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.655148 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.683653 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddn7d\" (UniqueName: \"kubernetes.io/projected/aebfd6cd-5a72-4797-b16a-492efaa1016e-kube-api-access-ddn7d\") pod \"barbican-operator-controller-manager-6ff8b75857-r2zwk\" (UID: \"aebfd6cd-5a72-4797-b16a-492efaa1016e\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.683700 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb95s\" (UniqueName: \"kubernetes.io/projected/a38189f9-08d1-4f4b-8949-4856b0f46d95-kube-api-access-wb95s\") pod \"cinder-operator-controller-manager-644bddb6d8-r7vxh\" (UID: \"a38189f9-08d1-4f4b-8949-4856b0f46d95\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.683729 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2brj\" (UniqueName: \"kubernetes.io/projected/a0036f14-fb94-4336-9e0b-d501cd080bd5-kube-api-access-c2brj\") pod \"glance-operator-controller-manager-84958c4d49-xk7kj\" (UID: \"a0036f14-fb94-4336-9e0b-d501cd080bd5\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.683765 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tt2g\" (UniqueName: \"kubernetes.io/projected/2339ca48-ee02-4443-a1fd-4ae2456f6569-kube-api-access-2tt2g\") pod \"heat-operator-controller-manager-5d889d78cf-ghxx6\" (UID: \"2339ca48-ee02-4443-a1fd-4ae2456f6569\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.683819 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n228h\" (UniqueName: \"kubernetes.io/projected/485f984b-4520-4753-b6e7-4584137d3d58-kube-api-access-n228h\") pod \"designate-operator-controller-manager-84f4f7b77b-5k8k5\" (UID: \"485f984b-4520-4753-b6e7-4584137d3d58\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.683862 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fslc\" (UniqueName: \"kubernetes.io/projected/fd82e0b0-7700-49a2-9a07-2695b2ffe2fc-kube-api-access-4fslc\") pod \"horizon-operator-controller-manager-9f4696d94-9bbsc\" (UID: \"fd82e0b0-7700-49a2-9a07-2695b2ffe2fc\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.684599 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.685811 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.688414 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.692372 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-r4xkd"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.697034 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.704156 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.712391 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.713404 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.715966 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.729661 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-b5nt9"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.733874 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddn7d\" (UniqueName: \"kubernetes.io/projected/aebfd6cd-5a72-4797-b16a-492efaa1016e-kube-api-access-ddn7d\") pod \"barbican-operator-controller-manager-6ff8b75857-r2zwk\" (UID: \"aebfd6cd-5a72-4797-b16a-492efaa1016e\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.735674 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb95s\" (UniqueName: \"kubernetes.io/projected/a38189f9-08d1-4f4b-8949-4856b0f46d95-kube-api-access-wb95s\") pod \"cinder-operator-controller-manager-644bddb6d8-r7vxh\" (UID: \"a38189f9-08d1-4f4b-8949-4856b0f46d95\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.744085 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.747651 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n228h\" (UniqueName: \"kubernetes.io/projected/485f984b-4520-4753-b6e7-4584137d3d58-kube-api-access-n228h\") pod \"designate-operator-controller-manager-84f4f7b77b-5k8k5\" (UID: \"485f984b-4520-4753-b6e7-4584137d3d58\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.755467 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.756460 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.759424 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-9hktj"
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.761217 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s"]
Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.762332 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.767682 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-92pb9" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.780390 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp"] Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.793299 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2brj\" (UniqueName: \"kubernetes.io/projected/a0036f14-fb94-4336-9e0b-d501cd080bd5-kube-api-access-c2brj\") pod \"glance-operator-controller-manager-84958c4d49-xk7kj\" (UID: \"a0036f14-fb94-4336-9e0b-d501cd080bd5\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.793388 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tt2g\" (UniqueName: \"kubernetes.io/projected/2339ca48-ee02-4443-a1fd-4ae2456f6569-kube-api-access-2tt2g\") pod \"heat-operator-controller-manager-5d889d78cf-ghxx6\" (UID: \"2339ca48-ee02-4443-a1fd-4ae2456f6569\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.793484 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd48r\" (UniqueName: \"kubernetes.io/projected/1b3e5dca-afd2-42de-a39a-e4e6fda92e90-kube-api-access-nd48r\") pod \"infra-operator-controller-manager-9d6c5db85-tk4kk\" (UID: \"1b3e5dca-afd2-42de-a39a-e4e6fda92e90\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.793506 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b3e5dca-afd2-42de-a39a-e4e6fda92e90-cert\") pod \"infra-operator-controller-manager-9d6c5db85-tk4kk\" (UID: \"1b3e5dca-afd2-42de-a39a-e4e6fda92e90\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.793582 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fslc\" (UniqueName: \"kubernetes.io/projected/fd82e0b0-7700-49a2-9a07-2695b2ffe2fc-kube-api-access-4fslc\") pod \"horizon-operator-controller-manager-9f4696d94-9bbsc\" (UID: \"fd82e0b0-7700-49a2-9a07-2695b2ffe2fc\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.819297 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.832688 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tt2g\" (UniqueName: \"kubernetes.io/projected/2339ca48-ee02-4443-a1fd-4ae2456f6569-kube-api-access-2tt2g\") pod \"heat-operator-controller-manager-5d889d78cf-ghxx6\" (UID: \"2339ca48-ee02-4443-a1fd-4ae2456f6569\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.833117 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fslc\" (UniqueName: \"kubernetes.io/projected/fd82e0b0-7700-49a2-9a07-2695b2ffe2fc-kube-api-access-4fslc\") pod \"horizon-operator-controller-manager-9f4696d94-9bbsc\" (UID: \"fd82e0b0-7700-49a2-9a07-2695b2ffe2fc\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.834405 4553 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.834667 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2brj\" (UniqueName: \"kubernetes.io/projected/a0036f14-fb94-4336-9e0b-d501cd080bd5-kube-api-access-c2brj\") pod \"glance-operator-controller-manager-84958c4d49-xk7kj\" (UID: \"a0036f14-fb94-4336-9e0b-d501cd080bd5\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.842430 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.844756 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s"] Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.878894 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.884993 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.898635 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25"] Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.899676 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvssh\" (UniqueName: \"kubernetes.io/projected/57489c87-d763-4cf2-a2c6-fd03b1ec7131-kube-api-access-hvssh\") pod \"ironic-operator-controller-manager-5cd4858477-s8nwm\" (UID: \"57489c87-d763-4cf2-a2c6-fd03b1ec7131\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.899723 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz6kv\" (UniqueName: \"kubernetes.io/projected/48e38a70-37cf-4efe-bac1-e0fe7b196b22-kube-api-access-bz6kv\") pod \"manila-operator-controller-manager-6d68dbc695-4sf9s\" (UID: \"48e38a70-37cf-4efe-bac1-e0fe7b196b22\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.899777 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2kz5\" (UniqueName: \"kubernetes.io/projected/e973f7e5-4256-4b75-8f51-e01ca131eeca-kube-api-access-q2kz5\") pod \"keystone-operator-controller-manager-5bd55b4bff-sjwqp\" (UID: \"e973f7e5-4256-4b75-8f51-e01ca131eeca\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.899801 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b3e5dca-afd2-42de-a39a-e4e6fda92e90-cert\") pod 
\"infra-operator-controller-manager-9d6c5db85-tk4kk\" (UID: \"1b3e5dca-afd2-42de-a39a-e4e6fda92e90\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.899819 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd48r\" (UniqueName: \"kubernetes.io/projected/1b3e5dca-afd2-42de-a39a-e4e6fda92e90-kube-api-access-nd48r\") pod \"infra-operator-controller-manager-9d6c5db85-tk4kk\" (UID: \"1b3e5dca-afd2-42de-a39a-e4e6fda92e90\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.899969 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25" Sep 30 19:45:53 crc kubenswrapper[4553]: E0930 19:45:53.905755 4553 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 30 19:45:53 crc kubenswrapper[4553]: E0930 19:45:53.905816 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b3e5dca-afd2-42de-a39a-e4e6fda92e90-cert podName:1b3e5dca-afd2-42de-a39a-e4e6fda92e90 nodeName:}" failed. No retries permitted until 2025-09-30 19:45:54.405799397 +0000 UTC m=+807.605301527 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b3e5dca-afd2-42de-a39a-e4e6fda92e90-cert") pod "infra-operator-controller-manager-9d6c5db85-tk4kk" (UID: "1b3e5dca-afd2-42de-a39a-e4e6fda92e90") : secret "infra-operator-webhook-server-cert" not found Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.911576 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n"] Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.913176 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.913919 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-rblfv" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.919010 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.923211 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p"] Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.924189 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.927254 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-g9rbn" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.931157 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rr4b9" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.950695 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n"] Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.957727 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd48r\" (UniqueName: \"kubernetes.io/projected/1b3e5dca-afd2-42de-a39a-e4e6fda92e90-kube-api-access-nd48r\") pod \"infra-operator-controller-manager-9d6c5db85-tk4kk\" (UID: \"1b3e5dca-afd2-42de-a39a-e4e6fda92e90\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.975076 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25"] Sep 30 19:45:53 crc kubenswrapper[4553]: I0930 19:45:53.990448 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.003665 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr78p\" (UniqueName: \"kubernetes.io/projected/081051c3-9106-4b8f-8850-42facfbb5583-kube-api-access-nr78p\") pod \"mariadb-operator-controller-manager-88c7-qhv4n\" (UID: \"081051c3-9106-4b8f-8850-42facfbb5583\") " 
pod="openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.003714 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2kz5\" (UniqueName: \"kubernetes.io/projected/e973f7e5-4256-4b75-8f51-e01ca131eeca-kube-api-access-q2kz5\") pod \"keystone-operator-controller-manager-5bd55b4bff-sjwqp\" (UID: \"e973f7e5-4256-4b75-8f51-e01ca131eeca\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.003785 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvssh\" (UniqueName: \"kubernetes.io/projected/57489c87-d763-4cf2-a2c6-fd03b1ec7131-kube-api-access-hvssh\") pod \"ironic-operator-controller-manager-5cd4858477-s8nwm\" (UID: \"57489c87-d763-4cf2-a2c6-fd03b1ec7131\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.003810 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz6kv\" (UniqueName: \"kubernetes.io/projected/48e38a70-37cf-4efe-bac1-e0fe7b196b22-kube-api-access-bz6kv\") pod \"manila-operator-controller-manager-6d68dbc695-4sf9s\" (UID: \"48e38a70-37cf-4efe-bac1-e0fe7b196b22\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.003830 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59n22\" (UniqueName: \"kubernetes.io/projected/e6fe293b-17b2-40c1-ac31-e456a23355b9-kube-api-access-59n22\") pod \"nova-operator-controller-manager-64cd67b5cb-5lc25\" (UID: \"e6fe293b-17b2-40c1-ac31-e456a23355b9\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.006891 4553 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.011349 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.012333 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.013635 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-4b2kw" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.026549 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.029968 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-fng2r" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.037105 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.038889 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvssh\" (UniqueName: \"kubernetes.io/projected/57489c87-d763-4cf2-a2c6-fd03b1ec7131-kube-api-access-hvssh\") pod \"ironic-operator-controller-manager-5cd4858477-s8nwm\" (UID: \"57489c87-d763-4cf2-a2c6-fd03b1ec7131\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.039363 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2kz5\" (UniqueName: 
\"kubernetes.io/projected/e973f7e5-4256-4b75-8f51-e01ca131eeca-kube-api-access-q2kz5\") pod \"keystone-operator-controller-manager-5bd55b4bff-sjwqp\" (UID: \"e973f7e5-4256-4b75-8f51-e01ca131eeca\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.043842 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.050271 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.053619 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.060631 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz6kv\" (UniqueName: \"kubernetes.io/projected/48e38a70-37cf-4efe-bac1-e0fe7b196b22-kube-api-access-bz6kv\") pod \"manila-operator-controller-manager-6d68dbc695-4sf9s\" (UID: \"48e38a70-37cf-4efe-bac1-e0fe7b196b22\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.064790 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-dbqmb" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.064978 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.068081 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-grf94"] Sep 30 19:45:54 crc 
kubenswrapper[4553]: I0930 19:45:54.069150 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.070481 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-t5hb5" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.076330 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.100386 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.104666 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.106014 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr78p\" (UniqueName: \"kubernetes.io/projected/081051c3-9106-4b8f-8850-42facfbb5583-kube-api-access-nr78p\") pod \"mariadb-operator-controller-manager-88c7-qhv4n\" (UID: \"081051c3-9106-4b8f-8850-42facfbb5583\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.106103 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9r2m\" (UniqueName: \"kubernetes.io/projected/6d78a774-042e-4b7b-9988-971454080ca0-kube-api-access-l9r2m\") pod \"ovn-operator-controller-manager-9976ff44c-9b2qm\" (UID: \"6d78a774-042e-4b7b-9988-971454080ca0\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm" Sep 30 19:45:54 crc 
kubenswrapper[4553]: I0930 19:45:54.106148 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw2tc\" (UniqueName: \"kubernetes.io/projected/fad8e76f-5b93-44c1-98d2-3f6f756cc23c-kube-api-access-sw2tc\") pod \"neutron-operator-controller-manager-849d5b9b84-jl46p\" (UID: \"fad8e76f-5b93-44c1-98d2-3f6f756cc23c\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.106169 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf7hw\" (UniqueName: \"kubernetes.io/projected/11181f5a-47aa-4d9b-b3eb-b6c5868bed4b-kube-api-access-hf7hw\") pod \"octavia-operator-controller-manager-7b787867f4-lgc4h\" (UID: \"11181f5a-47aa-4d9b-b3eb-b6c5868bed4b\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.106202 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59n22\" (UniqueName: \"kubernetes.io/projected/e6fe293b-17b2-40c1-ac31-e456a23355b9-kube-api-access-59n22\") pod \"nova-operator-controller-manager-64cd67b5cb-5lc25\" (UID: \"e6fe293b-17b2-40c1-ac31-e456a23355b9\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.138024 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.139858 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.141667 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr78p\" (UniqueName: \"kubernetes.io/projected/081051c3-9106-4b8f-8850-42facfbb5583-kube-api-access-nr78p\") pod \"mariadb-operator-controller-manager-88c7-qhv4n\" (UID: \"081051c3-9106-4b8f-8850-42facfbb5583\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.143682 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59n22\" (UniqueName: \"kubernetes.io/projected/e6fe293b-17b2-40c1-ac31-e456a23355b9-kube-api-access-59n22\") pod \"nova-operator-controller-manager-64cd67b5cb-5lc25\" (UID: \"e6fe293b-17b2-40c1-ac31-e456a23355b9\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.143985 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-v9wqw" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.146343 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-grf94"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.162545 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.215343 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.215696 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2f1179b9-fc96-402c-9387-7fb33c26a489-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8crsqm8\" (UID: \"2f1179b9-fc96-402c-9387-7fb33c26a489\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.215731 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgdgs\" (UniqueName: \"kubernetes.io/projected/811049e5-2659-408a-9370-77fe827766e1-kube-api-access-qgdgs\") pod \"placement-operator-controller-manager-589c58c6c-grf94\" (UID: \"811049e5-2659-408a-9370-77fe827766e1\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.215769 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw2tc\" (UniqueName: \"kubernetes.io/projected/fad8e76f-5b93-44c1-98d2-3f6f756cc23c-kube-api-access-sw2tc\") pod \"neutron-operator-controller-manager-849d5b9b84-jl46p\" (UID: \"fad8e76f-5b93-44c1-98d2-3f6f756cc23c\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.215796 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf7hw\" (UniqueName: \"kubernetes.io/projected/11181f5a-47aa-4d9b-b3eb-b6c5868bed4b-kube-api-access-hf7hw\") pod \"octavia-operator-controller-manager-7b787867f4-lgc4h\" (UID: \"11181f5a-47aa-4d9b-b3eb-b6c5868bed4b\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.215888 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvwvw\" (UniqueName: \"kubernetes.io/projected/2f1179b9-fc96-402c-9387-7fb33c26a489-kube-api-access-wvwvw\") pod 
\"openstack-baremetal-operator-controller-manager-77b9676b8crsqm8\" (UID: \"2f1179b9-fc96-402c-9387-7fb33c26a489\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.215998 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9r2m\" (UniqueName: \"kubernetes.io/projected/6d78a774-042e-4b7b-9988-971454080ca0-kube-api-access-l9r2m\") pod \"ovn-operator-controller-manager-9976ff44c-9b2qm\" (UID: \"6d78a774-042e-4b7b-9988-971454080ca0\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.250606 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf7hw\" (UniqueName: \"kubernetes.io/projected/11181f5a-47aa-4d9b-b3eb-b6c5868bed4b-kube-api-access-hf7hw\") pod \"octavia-operator-controller-manager-7b787867f4-lgc4h\" (UID: \"11181f5a-47aa-4d9b-b3eb-b6c5868bed4b\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.271121 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.272328 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9r2m\" (UniqueName: \"kubernetes.io/projected/6d78a774-042e-4b7b-9988-971454080ca0-kube-api-access-l9r2m\") pod \"ovn-operator-controller-manager-9976ff44c-9b2qm\" (UID: \"6d78a774-042e-4b7b-9988-971454080ca0\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.272994 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.289823 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.293542 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.296424 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.301303 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw2tc\" (UniqueName: \"kubernetes.io/projected/fad8e76f-5b93-44c1-98d2-3f6f756cc23c-kube-api-access-sw2tc\") pod \"neutron-operator-controller-manager-849d5b9b84-jl46p\" (UID: \"fad8e76f-5b93-44c1-98d2-3f6f756cc23c\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.310464 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-mm578" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.316793 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-m5mwg"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.361308 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-m5mwg" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.370352 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.371912 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f1179b9-fc96-402c-9387-7fb33c26a489-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8crsqm8\" (UID: \"2f1179b9-fc96-402c-9387-7fb33c26a489\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.371939 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.371942 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgdgs\" (UniqueName: \"kubernetes.io/projected/811049e5-2659-408a-9370-77fe827766e1-kube-api-access-qgdgs\") pod \"placement-operator-controller-manager-589c58c6c-grf94\" (UID: \"811049e5-2659-408a-9370-77fe827766e1\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.371996 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt8zr\" (UniqueName: \"kubernetes.io/projected/e5168b35-e85c-47e3-a641-e7003a2dbae7-kube-api-access-rt8zr\") pod \"swift-operator-controller-manager-84d6b4b759-8cnk4\" (UID: \"e5168b35-e85c-47e3-a641-e7003a2dbae7\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.372029 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvwvw\" (UniqueName: \"kubernetes.io/projected/2f1179b9-fc96-402c-9387-7fb33c26a489-kube-api-access-wvwvw\") pod 
\"openstack-baremetal-operator-controller-manager-77b9676b8crsqm8\" (UID: \"2f1179b9-fc96-402c-9387-7fb33c26a489\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" Sep 30 19:45:54 crc kubenswrapper[4553]: E0930 19:45:54.372396 4553 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 19:45:54 crc kubenswrapper[4553]: E0930 19:45:54.372440 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1179b9-fc96-402c-9387-7fb33c26a489-cert podName:2f1179b9-fc96-402c-9387-7fb33c26a489 nodeName:}" failed. No retries permitted until 2025-09-30 19:45:54.872427342 +0000 UTC m=+808.071929472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2f1179b9-fc96-402c-9387-7fb33c26a489-cert") pod "openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" (UID: "2f1179b9-fc96-402c-9387-7fb33c26a489") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.377896 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-fcc5s" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.390270 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-m5mwg"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.411298 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvwvw\" (UniqueName: \"kubernetes.io/projected/2f1179b9-fc96-402c-9387-7fb33c26a489-kube-api-access-wvwvw\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8crsqm8\" (UID: \"2f1179b9-fc96-402c-9387-7fb33c26a489\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.418399 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.424501 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.426596 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgdgs\" (UniqueName: \"kubernetes.io/projected/811049e5-2659-408a-9370-77fe827766e1-kube-api-access-qgdgs\") pod \"placement-operator-controller-manager-589c58c6c-grf94\" (UID: \"811049e5-2659-408a-9370-77fe827766e1\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.436000 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.456577 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.457310 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-pkc9s" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.473548 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b3e5dca-afd2-42de-a39a-e4e6fda92e90-cert\") pod \"infra-operator-controller-manager-9d6c5db85-tk4kk\" (UID: \"1b3e5dca-afd2-42de-a39a-e4e6fda92e90\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.475421 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfhp5\" (UniqueName: \"kubernetes.io/projected/0f072097-ae5a-4f90-86bb-8308893409d4-kube-api-access-cfhp5\") pod \"test-operator-controller-manager-85777745bb-m5mwg\" (UID: \"0f072097-ae5a-4f90-86bb-8308893409d4\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-m5mwg" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.475693 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2m4v\" (UniqueName: \"kubernetes.io/projected/24a84108-9502-4a67-8452-7ebdf2e358ed-kube-api-access-w2m4v\") pod \"telemetry-operator-controller-manager-b8d54b5d7-hqq9l\" (UID: \"24a84108-9502-4a67-8452-7ebdf2e358ed\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.475840 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rt8zr\" (UniqueName: \"kubernetes.io/projected/e5168b35-e85c-47e3-a641-e7003a2dbae7-kube-api-access-rt8zr\") pod \"swift-operator-controller-manager-84d6b4b759-8cnk4\" (UID: \"e5168b35-e85c-47e3-a641-e7003a2dbae7\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" Sep 30 19:45:54 crc kubenswrapper[4553]: E0930 19:45:54.474823 4553 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 30 19:45:54 crc kubenswrapper[4553]: E0930 19:45:54.476376 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b3e5dca-afd2-42de-a39a-e4e6fda92e90-cert podName:1b3e5dca-afd2-42de-a39a-e4e6fda92e90 nodeName:}" failed. No retries permitted until 2025-09-30 19:45:55.47636154 +0000 UTC m=+808.675863670 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b3e5dca-afd2-42de-a39a-e4e6fda92e90-cert") pod "infra-operator-controller-manager-9d6c5db85-tk4kk" (UID: "1b3e5dca-afd2-42de-a39a-e4e6fda92e90") : secret "infra-operator-webhook-server-cert" not found Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.518460 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.538113 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.539646 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.544758 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.545048 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-k6hxj" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.549321 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.562357 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.563265 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.565652 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn"] Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.567679 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-mg94x" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.568622 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt8zr\" (UniqueName: \"kubernetes.io/projected/e5168b35-e85c-47e3-a641-e7003a2dbae7-kube-api-access-rt8zr\") pod \"swift-operator-controller-manager-84d6b4b759-8cnk4\" (UID: \"e5168b35-e85c-47e3-a641-e7003a2dbae7\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" Sep 30 19:45:54 crc kubenswrapper[4553]: 
I0930 19:45:54.576561 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2840394a-f6dd-4890-aec6-aab3f4f9eaba-cert\") pod \"openstack-operator-controller-manager-98d66ccb9-2pvxf\" (UID: \"2840394a-f6dd-4890-aec6-aab3f4f9eaba\") " pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.576616 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxgf6\" (UniqueName: \"kubernetes.io/projected/826f8aa9-d307-4845-8e61-dc907c69a18c-kube-api-access-hxgf6\") pod \"watcher-operator-controller-manager-6b9957f54f-kgtt2\" (UID: \"826f8aa9-d307-4845-8e61-dc907c69a18c\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.576665 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfhp5\" (UniqueName: \"kubernetes.io/projected/0f072097-ae5a-4f90-86bb-8308893409d4-kube-api-access-cfhp5\") pod \"test-operator-controller-manager-85777745bb-m5mwg\" (UID: \"0f072097-ae5a-4f90-86bb-8308893409d4\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-m5mwg" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.576698 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ml68\" (UniqueName: \"kubernetes.io/projected/2840394a-f6dd-4890-aec6-aab3f4f9eaba-kube-api-access-2ml68\") pod \"openstack-operator-controller-manager-98d66ccb9-2pvxf\" (UID: \"2840394a-f6dd-4890-aec6-aab3f4f9eaba\") " pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.576716 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sp27s\" (UniqueName: \"kubernetes.io/projected/899fda30-4ef6-499d-961b-6e23466c55e3-kube-api-access-sp27s\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-nbccn\" (UID: \"899fda30-4ef6-499d-961b-6e23466c55e3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.576735 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2m4v\" (UniqueName: \"kubernetes.io/projected/24a84108-9502-4a67-8452-7ebdf2e358ed-kube-api-access-w2m4v\") pod \"telemetry-operator-controller-manager-b8d54b5d7-hqq9l\" (UID: \"24a84108-9502-4a67-8452-7ebdf2e358ed\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.613851 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2m4v\" (UniqueName: \"kubernetes.io/projected/24a84108-9502-4a67-8452-7ebdf2e358ed-kube-api-access-w2m4v\") pod \"telemetry-operator-controller-manager-b8d54b5d7-hqq9l\" (UID: \"24a84108-9502-4a67-8452-7ebdf2e358ed\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.614345 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfhp5\" (UniqueName: \"kubernetes.io/projected/0f072097-ae5a-4f90-86bb-8308893409d4-kube-api-access-cfhp5\") pod \"test-operator-controller-manager-85777745bb-m5mwg\" (UID: \"0f072097-ae5a-4f90-86bb-8308893409d4\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-m5mwg" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.681849 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxgf6\" (UniqueName: \"kubernetes.io/projected/826f8aa9-d307-4845-8e61-dc907c69a18c-kube-api-access-hxgf6\") pod 
\"watcher-operator-controller-manager-6b9957f54f-kgtt2\" (UID: \"826f8aa9-d307-4845-8e61-dc907c69a18c\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.681941 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ml68\" (UniqueName: \"kubernetes.io/projected/2840394a-f6dd-4890-aec6-aab3f4f9eaba-kube-api-access-2ml68\") pod \"openstack-operator-controller-manager-98d66ccb9-2pvxf\" (UID: \"2840394a-f6dd-4890-aec6-aab3f4f9eaba\") " pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.681962 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp27s\" (UniqueName: \"kubernetes.io/projected/899fda30-4ef6-499d-961b-6e23466c55e3-kube-api-access-sp27s\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-nbccn\" (UID: \"899fda30-4ef6-499d-961b-6e23466c55e3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.682023 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2840394a-f6dd-4890-aec6-aab3f4f9eaba-cert\") pod \"openstack-operator-controller-manager-98d66ccb9-2pvxf\" (UID: \"2840394a-f6dd-4890-aec6-aab3f4f9eaba\") " pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" Sep 30 19:45:54 crc kubenswrapper[4553]: E0930 19:45:54.682211 4553 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 19:45:54 crc kubenswrapper[4553]: E0930 19:45:54.682261 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2840394a-f6dd-4890-aec6-aab3f4f9eaba-cert podName:2840394a-f6dd-4890-aec6-aab3f4f9eaba nodeName:}" failed. 
No retries permitted until 2025-09-30 19:45:55.182245194 +0000 UTC m=+808.381747324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2840394a-f6dd-4890-aec6-aab3f4f9eaba-cert") pod "openstack-operator-controller-manager-98d66ccb9-2pvxf" (UID: "2840394a-f6dd-4890-aec6-aab3f4f9eaba") : secret "webhook-server-cert" not found Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.726711 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxgf6\" (UniqueName: \"kubernetes.io/projected/826f8aa9-d307-4845-8e61-dc907c69a18c-kube-api-access-hxgf6\") pod \"watcher-operator-controller-manager-6b9957f54f-kgtt2\" (UID: \"826f8aa9-d307-4845-8e61-dc907c69a18c\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.729348 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.731590 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ml68\" (UniqueName: \"kubernetes.io/projected/2840394a-f6dd-4890-aec6-aab3f4f9eaba-kube-api-access-2ml68\") pod \"openstack-operator-controller-manager-98d66ccb9-2pvxf\" (UID: \"2840394a-f6dd-4890-aec6-aab3f4f9eaba\") " pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.748594 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp27s\" (UniqueName: \"kubernetes.io/projected/899fda30-4ef6-499d-961b-6e23466c55e3-kube-api-access-sp27s\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-nbccn\" (UID: \"899fda30-4ef6-499d-961b-6e23466c55e3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn" Sep 30 19:45:54 crc 
kubenswrapper[4553]: I0930 19:45:54.767703 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-m5mwg" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.777062 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.828326 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.887676 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f1179b9-fc96-402c-9387-7fb33c26a489-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8crsqm8\" (UID: \"2f1179b9-fc96-402c-9387-7fb33c26a489\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.898742 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f1179b9-fc96-402c-9387-7fb33c26a489-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8crsqm8\" (UID: \"2f1179b9-fc96-402c-9387-7fb33c26a489\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" Sep 30 19:45:54 crc kubenswrapper[4553]: I0930 19:45:54.968122 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.083408 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.131467 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6"] Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.137899 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5"] Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.166887 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk"] Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.198807 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2840394a-f6dd-4890-aec6-aab3f4f9eaba-cert\") pod \"openstack-operator-controller-manager-98d66ccb9-2pvxf\" (UID: \"2840394a-f6dd-4890-aec6-aab3f4f9eaba\") " pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" Sep 30 19:45:55 crc kubenswrapper[4553]: E0930 19:45:55.198944 4553 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 19:45:55 crc kubenswrapper[4553]: E0930 19:45:55.198987 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2840394a-f6dd-4890-aec6-aab3f4f9eaba-cert podName:2840394a-f6dd-4890-aec6-aab3f4f9eaba nodeName:}" failed. No retries permitted until 2025-09-30 19:45:56.198974439 +0000 UTC m=+809.398476569 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2840394a-f6dd-4890-aec6-aab3f4f9eaba-cert") pod "openstack-operator-controller-manager-98d66ccb9-2pvxf" (UID: "2840394a-f6dd-4890-aec6-aab3f4f9eaba") : secret "webhook-server-cert" not found Sep 30 19:45:55 crc kubenswrapper[4553]: W0930 19:45:55.237132 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2339ca48_ee02_4443_a1fd_4ae2456f6569.slice/crio-aa356b524894fdcb89942f0aff0640306701b915d332e8aabed2b2a44ee369b6 WatchSource:0}: Error finding container aa356b524894fdcb89942f0aff0640306701b915d332e8aabed2b2a44ee369b6: Status 404 returned error can't find the container with id aa356b524894fdcb89942f0aff0640306701b915d332e8aabed2b2a44ee369b6 Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.246652 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hqf8l"] Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.253359 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.265632 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hqf8l"] Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.301024 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-utilities\") pod \"community-operators-hqf8l\" (UID: \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\") " pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.301131 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8lht\" (UniqueName: \"kubernetes.io/projected/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-kube-api-access-h8lht\") pod \"community-operators-hqf8l\" (UID: \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\") " pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.301164 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-catalog-content\") pod \"community-operators-hqf8l\" (UID: \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\") " pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.391213 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5" event={"ID":"485f984b-4520-4753-b6e7-4584137d3d58","Type":"ContainerStarted","Data":"ee5a5962ef5f4121c99cc97bc575b9983f2421028680d86ede20be3122b97d3b"} Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.399339 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6" event={"ID":"2339ca48-ee02-4443-a1fd-4ae2456f6569","Type":"ContainerStarted","Data":"aa356b524894fdcb89942f0aff0640306701b915d332e8aabed2b2a44ee369b6"} Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.400596 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk" event={"ID":"aebfd6cd-5a72-4797-b16a-492efaa1016e","Type":"ContainerStarted","Data":"1986c78e204cb8afdf7a1402f481ed4b8d5a2dc743652ff59eea0665e1cec372"} Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.403378 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-utilities\") pod \"community-operators-hqf8l\" (UID: \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\") " pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.403629 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8lht\" (UniqueName: \"kubernetes.io/projected/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-kube-api-access-h8lht\") pod \"community-operators-hqf8l\" (UID: \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\") " pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.403763 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-catalog-content\") pod \"community-operators-hqf8l\" (UID: \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\") " pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.404436 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-catalog-content\") pod \"community-operators-hqf8l\" (UID: \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\") " pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.406805 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-utilities\") pod \"community-operators-hqf8l\" (UID: \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\") " pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.453754 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8lht\" (UniqueName: \"kubernetes.io/projected/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-kube-api-access-h8lht\") pod \"community-operators-hqf8l\" (UID: \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\") " pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.464109 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp"] Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.509004 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b3e5dca-afd2-42de-a39a-e4e6fda92e90-cert\") pod \"infra-operator-controller-manager-9d6c5db85-tk4kk\" (UID: \"1b3e5dca-afd2-42de-a39a-e4e6fda92e90\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.512129 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b3e5dca-afd2-42de-a39a-e4e6fda92e90-cert\") pod \"infra-operator-controller-manager-9d6c5db85-tk4kk\" (UID: \"1b3e5dca-afd2-42de-a39a-e4e6fda92e90\") " 
pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.514983 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.583554 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.608057 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh"] Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.966219 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc"] Sep 30 19:45:55 crc kubenswrapper[4553]: I0930 19:45:55.980562 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n"] Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.010168 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm"] Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.015518 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25"] Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.050154 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s"] Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.054941 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj"] Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.147228 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p"] Sep 30 19:45:56 crc kubenswrapper[4553]: W0930 19:45:56.156085 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0036f14_fb94_4336_9e0b_d501cd080bd5.slice/crio-748b243b038d20b62c86d8f4dde22e657be5ace3f6f66dbde75cdb4f6ceb46c6 WatchSource:0}: Error finding container 748b243b038d20b62c86d8f4dde22e657be5ace3f6f66dbde75cdb4f6ceb46c6: Status 404 returned error can't find the container with id 748b243b038d20b62c86d8f4dde22e657be5ace3f6f66dbde75cdb4f6ceb46c6 Sep 30 19:45:56 crc kubenswrapper[4553]: W0930 19:45:56.167977 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad8e76f_5b93_44c1_98d2_3f6f756cc23c.slice/crio-eee95aea8e286cb429e05294890cde0108683ff0fde3c2fabfca6b5a1cc31c4d WatchSource:0}: Error finding container eee95aea8e286cb429e05294890cde0108683ff0fde3c2fabfca6b5a1cc31c4d: Status 404 returned error can't find the container with id eee95aea8e286cb429e05294890cde0108683ff0fde3c2fabfca6b5a1cc31c4d Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.226718 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2840394a-f6dd-4890-aec6-aab3f4f9eaba-cert\") pod \"openstack-operator-controller-manager-98d66ccb9-2pvxf\" (UID: \"2840394a-f6dd-4890-aec6-aab3f4f9eaba\") " pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.230121 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2840394a-f6dd-4890-aec6-aab3f4f9eaba-cert\") pod \"openstack-operator-controller-manager-98d66ccb9-2pvxf\" (UID: \"2840394a-f6dd-4890-aec6-aab3f4f9eaba\") " 
pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.368637 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h"] Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.400872 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm"] Sep 30 19:45:56 crc kubenswrapper[4553]: W0930 19:45:56.406573 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d78a774_042e_4b7b_9988_971454080ca0.slice/crio-002ac3de33498c712533b5a88d6eabe7a87d3220a499f019e6b9df9074be25d6 WatchSource:0}: Error finding container 002ac3de33498c712533b5a88d6eabe7a87d3220a499f019e6b9df9074be25d6: Status 404 returned error can't find the container with id 002ac3de33498c712533b5a88d6eabe7a87d3220a499f019e6b9df9074be25d6 Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.419772 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-m5mwg"] Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.438358 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-grf94"] Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.444828 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2"] Sep 30 19:45:56 crc kubenswrapper[4553]: W0930 19:45:56.447848 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod811049e5_2659_408a_9370_77fe827766e1.slice/crio-78e2f611a8614940dd09922143af17300eefb37791dbb6881d45276a980b84de WatchSource:0}: Error finding container 
78e2f611a8614940dd09922143af17300eefb37791dbb6881d45276a980b84de: Status 404 returned error can't find the container with id 78e2f611a8614940dd09922143af17300eefb37791dbb6881d45276a980b84de Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.451527 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n" event={"ID":"081051c3-9106-4b8f-8850-42facfbb5583","Type":"ContainerStarted","Data":"bd8c473a022912c458bae449b35215e38dd1cd0820676a3d2bcb0a81a7f5e762"} Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.458744 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.459358 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp" event={"ID":"e973f7e5-4256-4b75-8f51-e01ca131eeca","Type":"ContainerStarted","Data":"999092e7ba845e73595c9bb37418dd077ab81e43d647fd5dea91dd06cffd8bd8"} Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.461216 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh" event={"ID":"a38189f9-08d1-4f4b-8949-4856b0f46d95","Type":"ContainerStarted","Data":"15cfd3560e72c459801fc841e2cae5bdb86c1957ebbae7af31201c53dd663670"} Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.462641 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm" event={"ID":"57489c87-d763-4cf2-a2c6-fd03b1ec7131","Type":"ContainerStarted","Data":"1fc6d480bb8e8e7a2017c3078bf227e9ca7be441af3a8cf79155bb053e812e3b"} Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.463533 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p" 
event={"ID":"fad8e76f-5b93-44c1-98d2-3f6f756cc23c","Type":"ContainerStarted","Data":"eee95aea8e286cb429e05294890cde0108683ff0fde3c2fabfca6b5a1cc31c4d"} Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.466388 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h" event={"ID":"11181f5a-47aa-4d9b-b3eb-b6c5868bed4b","Type":"ContainerStarted","Data":"e1fcf0d669fa4ca15e90a71d98d32c7039a2e9425a468414ac0d361a3c08ea58"} Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.473805 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25" event={"ID":"e6fe293b-17b2-40c1-ac31-e456a23355b9","Type":"ContainerStarted","Data":"d730f8b8f5730148e3541d4926ad6c6fe8a7f5c024e72100d2aa5bdb9c18dca7"} Sep 30 19:45:56 crc kubenswrapper[4553]: E0930 19:45:56.479517 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qgdgs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-589c58c6c-grf94_openstack-operators(811049e5-2659-408a-9370-77fe827766e1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.479995 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj" event={"ID":"a0036f14-fb94-4336-9e0b-d501cd080bd5","Type":"ContainerStarted","Data":"748b243b038d20b62c86d8f4dde22e657be5ace3f6f66dbde75cdb4f6ceb46c6"} Sep 30 
19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.482736 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s" event={"ID":"48e38a70-37cf-4efe-bac1-e0fe7b196b22","Type":"ContainerStarted","Data":"8703250b755019a161aae2c4cbd45ed1323d49cfdd2032a125924ccc18d078b6"} Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.484362 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc" event={"ID":"fd82e0b0-7700-49a2-9a07-2695b2ffe2fc","Type":"ContainerStarted","Data":"c477d520debc708664f2c8e28d1feb8430aff1f8699472caf022493b6c96b3d5"} Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.543219 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l"] Sep 30 19:45:56 crc kubenswrapper[4553]: W0930 19:45:56.567873 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24a84108_9502_4a67_8452_7ebdf2e358ed.slice/crio-ec5e9c62daf935c2317da377ce4479d4f2d6a4027c8683a6403918d5f15fc5a1 WatchSource:0}: Error finding container ec5e9c62daf935c2317da377ce4479d4f2d6a4027c8683a6403918d5f15fc5a1: Status 404 returned error can't find the container with id ec5e9c62daf935c2317da377ce4479d4f2d6a4027c8683a6403918d5f15fc5a1 Sep 30 19:45:56 crc kubenswrapper[4553]: E0930 19:45:56.570064 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w2m4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
telemetry-operator-controller-manager-b8d54b5d7-hqq9l_openstack-operators(24a84108-9502-4a67-8452-7ebdf2e358ed): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 19:45:56 crc kubenswrapper[4553]: E0930 19:45:56.650901 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" podUID="811049e5-2659-408a-9370-77fe827766e1" Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.688521 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn"] Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.692563 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk"] Sep 30 19:45:56 crc kubenswrapper[4553]: W0930 19:45:56.696332 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b3e5dca_afd2_42de_a39a_e4e6fda92e90.slice/crio-3e1f3604f58ab4efe0473e8ba8bc75ee7d366fb7e8e1189b2b743fe35797201f WatchSource:0}: Error finding container 3e1f3604f58ab4efe0473e8ba8bc75ee7d366fb7e8e1189b2b743fe35797201f: Status 404 returned error can't find the container with id 3e1f3604f58ab4efe0473e8ba8bc75ee7d366fb7e8e1189b2b743fe35797201f Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.712994 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8"] Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.766532 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hqf8l"] Sep 30 19:45:56 crc kubenswrapper[4553]: E0930 19:45:56.779838 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-w
orker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podif
ied-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTR
ON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-
centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:curre
nt-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-cent
os9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-maste
r-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wvwvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-77b9676b8crsqm8_openstack-operators(2f1179b9-fc96-402c-9387-7fb33c26a489): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 19:45:56 crc kubenswrapper[4553]: E0930 19:45:56.780369 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sp27s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-nbccn_openstack-operators(899fda30-4ef6-499d-961b-6e23466c55e3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 19:45:56 crc kubenswrapper[4553]: E0930 19:45:56.781848 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn" podUID="899fda30-4ef6-499d-961b-6e23466c55e3" Sep 30 19:45:56 crc kubenswrapper[4553]: W0930 19:45:56.788817 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d236a26_bc53_4b88_a8f0_ef72f5ea899b.slice/crio-42369be7e9139f64015536799b46e87296b1b23533cdce8eced83bff5d26e764 WatchSource:0}: Error finding container 42369be7e9139f64015536799b46e87296b1b23533cdce8eced83bff5d26e764: Status 404 returned error can't find the container with id 
42369be7e9139f64015536799b46e87296b1b23533cdce8eced83bff5d26e764 Sep 30 19:45:56 crc kubenswrapper[4553]: I0930 19:45:56.845941 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4"] Sep 30 19:45:56 crc kubenswrapper[4553]: W0930 19:45:56.869955 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5168b35_e85c_47e3_a641_e7003a2dbae7.slice/crio-d62978101e7e02ad20b1e87a103db7f9194ba38ae625160f3841b7cc3ecceaca WatchSource:0}: Error finding container d62978101e7e02ad20b1e87a103db7f9194ba38ae625160f3841b7cc3ecceaca: Status 404 returned error can't find the container with id d62978101e7e02ad20b1e87a103db7f9194ba38ae625160f3841b7cc3ecceaca Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.022362 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf"] Sep 30 19:45:57 crc kubenswrapper[4553]: E0930 19:45:57.074506 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" podUID="24a84108-9502-4a67-8452-7ebdf2e358ed" Sep 30 19:45:57 crc kubenswrapper[4553]: E0930 19:45:57.186242 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" podUID="2f1179b9-fc96-402c-9387-7fb33c26a489" Sep 30 19:45:57 crc kubenswrapper[4553]: E0930 19:45:57.558946 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" podUID="2f1179b9-fc96-402c-9387-7fb33c26a489" Sep 30 19:45:57 crc kubenswrapper[4553]: E0930 19:45:57.559372 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn" podUID="899fda30-4ef6-499d-961b-6e23466c55e3" Sep 30 19:45:57 crc kubenswrapper[4553]: E0930 19:45:57.578161 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" podUID="24a84108-9502-4a67-8452-7ebdf2e358ed" Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.621856 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" event={"ID":"2f1179b9-fc96-402c-9387-7fb33c26a489","Type":"ContainerStarted","Data":"ab93457e640c2d3f507a802671ecfe77fc55091ec53d488cccaffaa2889f42b6"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.621895 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" event={"ID":"2f1179b9-fc96-402c-9387-7fb33c26a489","Type":"ContainerStarted","Data":"cff69e4f75de97ad34b72279d09bf630186e1d50bdba6ee70a1ae89f8b98505b"} Sep 30 19:45:57 crc kubenswrapper[4553]: 
I0930 19:45:57.621911 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm" event={"ID":"6d78a774-042e-4b7b-9988-971454080ca0","Type":"ContainerStarted","Data":"002ac3de33498c712533b5a88d6eabe7a87d3220a499f019e6b9df9074be25d6"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.621928 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.621939 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn" event={"ID":"899fda30-4ef6-499d-961b-6e23466c55e3","Type":"ContainerStarted","Data":"12146f03a76d7b66ceb4f89c1fcd3f8f259d403e96a1cda36c49f71301f7a481"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.621948 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2" event={"ID":"826f8aa9-d307-4845-8e61-dc907c69a18c","Type":"ContainerStarted","Data":"85f84612bbeac86fa6418a44911969011650d52b62871c33aea4398b7aab4cb6"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.621965 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" event={"ID":"1b3e5dca-afd2-42de-a39a-e4e6fda92e90","Type":"ContainerStarted","Data":"3e1f3604f58ab4efe0473e8ba8bc75ee7d366fb7e8e1189b2b743fe35797201f"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.621975 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" event={"ID":"24a84108-9502-4a67-8452-7ebdf2e358ed","Type":"ContainerStarted","Data":"309a9f5d857749d8ae53aabeb413dce983476c616ea7d942a2ce41dfceefc81b"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.621984 4553 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" event={"ID":"24a84108-9502-4a67-8452-7ebdf2e358ed","Type":"ContainerStarted","Data":"ec5e9c62daf935c2317da377ce4479d4f2d6a4027c8683a6403918d5f15fc5a1"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.621992 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-m5mwg" event={"ID":"0f072097-ae5a-4f90-86bb-8308893409d4","Type":"ContainerStarted","Data":"edf554336cfbd4fd62466e2f6f64e5fe6df75236cb5ec4b09c73bd57959fedfa"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.622000 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" event={"ID":"2840394a-f6dd-4890-aec6-aab3f4f9eaba","Type":"ContainerStarted","Data":"8dd4077313489a50a4c63345c62f2f25dadbd2d9054b64e2787b1910c81b8c08"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.622009 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" event={"ID":"2840394a-f6dd-4890-aec6-aab3f4f9eaba","Type":"ContainerStarted","Data":"991fd234e332bf241d0d21723f174f29aece0fb29a9ab5668c41beb5332edbd4"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.622023 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" event={"ID":"2840394a-f6dd-4890-aec6-aab3f4f9eaba","Type":"ContainerStarted","Data":"169335d2a3a6efa217782f28521869986dd6104c818b4d8711c4750cb7730d40"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.622031 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" 
event={"ID":"811049e5-2659-408a-9370-77fe827766e1","Type":"ContainerStarted","Data":"6b6be139a0a5731368dc451f8b4875291c4ad22a5713f76dccd2287a0f4f2ea1"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.622055 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" event={"ID":"811049e5-2659-408a-9370-77fe827766e1","Type":"ContainerStarted","Data":"78e2f611a8614940dd09922143af17300eefb37791dbb6881d45276a980b84de"} Sep 30 19:45:57 crc kubenswrapper[4553]: E0930 19:45:57.629014 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" podUID="811049e5-2659-408a-9370-77fe827766e1" Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.659936 4553 generic.go:334] "Generic (PLEG): container finished" podID="6d236a26-bc53-4b88-a8f0-ef72f5ea899b" containerID="adee2b3e22ad48992d67825d028f8b2ebe740b77c83524e0aa4eaafa5e782388" exitCode=0 Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.660001 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqf8l" event={"ID":"6d236a26-bc53-4b88-a8f0-ef72f5ea899b","Type":"ContainerDied","Data":"adee2b3e22ad48992d67825d028f8b2ebe740b77c83524e0aa4eaafa5e782388"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.660027 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqf8l" event={"ID":"6d236a26-bc53-4b88-a8f0-ef72f5ea899b","Type":"ContainerStarted","Data":"42369be7e9139f64015536799b46e87296b1b23533cdce8eced83bff5d26e764"} Sep 30 19:45:57 crc kubenswrapper[4553]: I0930 19:45:57.668215 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" event={"ID":"e5168b35-e85c-47e3-a641-e7003a2dbae7","Type":"ContainerStarted","Data":"d62978101e7e02ad20b1e87a103db7f9194ba38ae625160f3841b7cc3ecceaca"} Sep 30 19:45:58 crc kubenswrapper[4553]: I0930 19:45:58.220273 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" podStartSLOduration=4.220258303 podStartE2EDuration="4.220258303s" podCreationTimestamp="2025-09-30 19:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:45:58.217949571 +0000 UTC m=+811.417451721" watchObservedRunningTime="2025-09-30 19:45:58.220258303 +0000 UTC m=+811.419760433" Sep 30 19:45:58 crc kubenswrapper[4553]: E0930 19:45:58.696613 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" podUID="811049e5-2659-408a-9370-77fe827766e1" Sep 30 19:45:58 crc kubenswrapper[4553]: E0930 19:45:58.698851 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" podUID="2f1179b9-fc96-402c-9387-7fb33c26a489" Sep 30 19:45:58 crc kubenswrapper[4553]: E0930 19:45:58.698960 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" podUID="24a84108-9502-4a67-8452-7ebdf2e358ed" Sep 30 19:45:58 crc kubenswrapper[4553]: E0930 19:45:58.717541 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn" podUID="899fda30-4ef6-499d-961b-6e23466c55e3" Sep 30 19:46:06 crc kubenswrapper[4553]: I0930 19:46:06.469344 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-98d66ccb9-2pvxf" Sep 30 19:46:09 crc kubenswrapper[4553]: E0930 19:46:09.866572 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8" Sep 30 19:46:09 crc kubenswrapper[4553]: E0930 19:46:09.867240 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sw2tc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
neutron-operator-controller-manager-849d5b9b84-jl46p_openstack-operators(fad8e76f-5b93-44c1-98d2-3f6f756cc23c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:46:10 crc kubenswrapper[4553]: E0930 19:46:10.351251 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06" Sep 30 19:46:10 crc kubenswrapper[4553]: E0930 19:46:10.351685 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hxgf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6b9957f54f-kgtt2_openstack-operators(826f8aa9-d307-4845-8e61-dc907c69a18c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:46:10 crc kubenswrapper[4553]: E0930 19:46:10.754536 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:f5f0d2eb534f763cf6578af513add1c21c1659b2cd75214dfddfedb9eebf6397" Sep 30 19:46:10 crc kubenswrapper[4553]: E0930 19:46:10.754739 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:f5f0d2eb534f763cf6578af513add1c21c1659b2cd75214dfddfedb9eebf6397,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4fslc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
horizon-operator-controller-manager-9f4696d94-9bbsc_openstack-operators(fd82e0b0-7700-49a2-9a07-2695b2ffe2fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:46:12 crc kubenswrapper[4553]: E0930 19:46:12.162869 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f" Sep 30 19:46:12 crc kubenswrapper[4553]: E0930 19:46:12.163166 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-59n22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-64cd67b5cb-5lc25_openstack-operators(e6fe293b-17b2-40c1-ac31-e456a23355b9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:46:12 crc kubenswrapper[4553]: E0930 19:46:12.713239 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884" Sep 30 19:46:12 crc kubenswrapper[4553]: E0930 19:46:12.713408 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bz6kv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
manila-operator-controller-manager-6d68dbc695-4sf9s_openstack-operators(48e38a70-37cf-4efe-bac1-e0fe7b196b22): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:46:14 crc kubenswrapper[4553]: E0930 19:46:14.123796 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:23fcec0642cbd40af10bca0c5d4e538662d21eda98d6dfec37c38b4d7a47191a" Sep 30 19:46:14 crc kubenswrapper[4553]: E0930 19:46:14.124415 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:23fcec0642cbd40af10bca0c5d4e538662d21eda98d6dfec37c38b4d7a47191a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-5bd55b4bff-sjwqp_openstack-operators(e973f7e5-4256-4b75-8f51-e01ca131eeca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:46:14 crc kubenswrapper[4553]: E0930 19:46:14.571671 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:bb39758cc8cd0d2cd02841dc81b53fd88647e2db15ee16cdd8c44d4098a942fd" Sep 30 19:46:14 crc kubenswrapper[4553]: E0930 19:46:14.572279 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:bb39758cc8cd0d2cd02841dc81b53fd88647e2db15ee16cdd8c44d4098a942fd,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ddn7d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
barbican-operator-controller-manager-6ff8b75857-r2zwk_openstack-operators(aebfd6cd-5a72-4797-b16a-492efaa1016e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:46:15 crc kubenswrapper[4553]: E0930 19:46:15.508193 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3" Sep 30 19:46:15 crc kubenswrapper[4553]: E0930 19:46:15.508544 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nd48r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-9d6c5db85-tk4kk_openstack-operators(1b3e5dca-afd2-42de-a39a-e4e6fda92e90): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:46:17 crc kubenswrapper[4553]: E0930 19:46:17.193647 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4" Sep 30 19:46:17 crc kubenswrapper[4553]: E0930 19:46:17.194212 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rt8zr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
swift-operator-controller-manager-84d6b4b759-8cnk4_openstack-operators(e5168b35-e85c-47e3-a641-e7003a2dbae7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:46:19 crc kubenswrapper[4553]: E0930 19:46:19.187602 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2" podUID="826f8aa9-d307-4845-8e61-dc907c69a18c" Sep 30 19:46:19 crc kubenswrapper[4553]: I0930 19:46:19.872393 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm" event={"ID":"6d78a774-042e-4b7b-9988-971454080ca0","Type":"ContainerStarted","Data":"feeb15001eddc6a992bdcf5ce02acdeed4da650db094f15deaae887be354b3c0"} Sep 30 19:46:19 crc kubenswrapper[4553]: I0930 19:46:19.879363 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2" event={"ID":"826f8aa9-d307-4845-8e61-dc907c69a18c","Type":"ContainerStarted","Data":"3ab0fb38d35b01bca8e0c518024248297c48f3e9bfde3226462b4c9a1e749289"} Sep 30 19:46:19 crc kubenswrapper[4553]: E0930 19:46:19.880529 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2" podUID="826f8aa9-d307-4845-8e61-dc907c69a18c" Sep 30 19:46:19 crc kubenswrapper[4553]: E0930 19:46:19.884624 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying 
config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc" podUID="fd82e0b0-7700-49a2-9a07-2695b2ffe2fc" Sep 30 19:46:19 crc kubenswrapper[4553]: I0930 19:46:19.886970 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6" event={"ID":"2339ca48-ee02-4443-a1fd-4ae2456f6569","Type":"ContainerStarted","Data":"9290abf7f2c1642945d160c187e32fb481b3ed4c3ea87d13e56071dee0b1884a"} Sep 30 19:46:19 crc kubenswrapper[4553]: I0930 19:46:19.891277 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh" event={"ID":"a38189f9-08d1-4f4b-8949-4856b0f46d95","Type":"ContainerStarted","Data":"9d6b59fa1940be95300ae8067cb2a7975227fec83147b94712515cea54c61d8a"} Sep 30 19:46:19 crc kubenswrapper[4553]: I0930 19:46:19.898693 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-m5mwg" event={"ID":"0f072097-ae5a-4f90-86bb-8308893409d4","Type":"ContainerStarted","Data":"c2623911805aa156cc57eb792d4fd497c4572fa0b1188667943c9c7c912b84a9"} Sep 30 19:46:19 crc kubenswrapper[4553]: I0930 19:46:19.901580 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj" event={"ID":"a0036f14-fb94-4336-9e0b-d501cd080bd5","Type":"ContainerStarted","Data":"e14deb95e8268198fad6fd85e1354cd486ce8b1e5b2e5aa8c35fdeb2f52e7c08"} Sep 30 19:46:19 crc kubenswrapper[4553]: I0930 19:46:19.907084 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm" event={"ID":"57489c87-d763-4cf2-a2c6-fd03b1ec7131","Type":"ContainerStarted","Data":"3fb10e58573180fe4a56de5231805fc063bff417bb711a3362a5bc4aef7d4311"} Sep 30 19:46:19 crc kubenswrapper[4553]: I0930 19:46:19.913615 4553 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h" event={"ID":"11181f5a-47aa-4d9b-b3eb-b6c5868bed4b","Type":"ContainerStarted","Data":"f228212955dd689e7740b39853c63312892edd607dc33e9db77d0f5960d6b38c"} Sep 30 19:46:19 crc kubenswrapper[4553]: I0930 19:46:19.939949 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5" event={"ID":"485f984b-4520-4753-b6e7-4584137d3d58","Type":"ContainerStarted","Data":"b831bb84a8ec9f6084780cbe33856c5f21d96f2d5c1056051c12b251f10a70d7"} Sep 30 19:46:19 crc kubenswrapper[4553]: I0930 19:46:19.947757 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqf8l" event={"ID":"6d236a26-bc53-4b88-a8f0-ef72f5ea899b","Type":"ContainerStarted","Data":"fac6f4e2d6dd19846330cd1fb59f8a146d441b0e2e69e29a1302693cfc51156a"} Sep 30 19:46:19 crc kubenswrapper[4553]: I0930 19:46:19.950575 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n" event={"ID":"081051c3-9106-4b8f-8850-42facfbb5583","Type":"ContainerStarted","Data":"35a92318289735fb154f0fc6c83fb4695a82237cf9950688ece83f6470eca0b0"} Sep 30 19:46:20 crc kubenswrapper[4553]: I0930 19:46:20.007460 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" event={"ID":"2f1179b9-fc96-402c-9387-7fb33c26a489","Type":"ContainerStarted","Data":"b4f833d4ade54696cf0a6eadfda5d943db3dfb822bfa383ed01041b2f776bff3"} Sep 30 19:46:20 crc kubenswrapper[4553]: I0930 19:46:20.008418 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" Sep 30 19:46:20 crc kubenswrapper[4553]: I0930 19:46:20.064204 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" podStartSLOduration=5.12067289 podStartE2EDuration="27.064187649s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.779467283 +0000 UTC m=+809.978969413" lastFinishedPulling="2025-09-30 19:46:18.722982042 +0000 UTC m=+831.922484172" observedRunningTime="2025-09-30 19:46:20.060367317 +0000 UTC m=+833.259869447" watchObservedRunningTime="2025-09-30 19:46:20.064187649 +0000 UTC m=+833.263689779" Sep 30 19:46:20 crc kubenswrapper[4553]: E0930 19:46:20.102190 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" podUID="1b3e5dca-afd2-42de-a39a-e4e6fda92e90" Sep 30 19:46:20 crc kubenswrapper[4553]: E0930 19:46:20.162682 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p" podUID="fad8e76f-5b93-44c1-98d2-3f6f756cc23c" Sep 30 19:46:21 crc kubenswrapper[4553]: I0930 19:46:21.017797 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p" event={"ID":"fad8e76f-5b93-44c1-98d2-3f6f756cc23c","Type":"ContainerStarted","Data":"5817f9ba0448cb889377ce27a051a1f0bdf9a69073adc728cb8d13def41ba022"} Sep 30 19:46:21 crc kubenswrapper[4553]: I0930 19:46:21.020770 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn" event={"ID":"899fda30-4ef6-499d-961b-6e23466c55e3","Type":"ContainerStarted","Data":"829bbcae7b3e212c5c6f2e4297690a22ddef94d3b01db815ed9bfe6abc9e0336"} Sep 
30 19:46:21 crc kubenswrapper[4553]: I0930 19:46:21.026993 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6" event={"ID":"2339ca48-ee02-4443-a1fd-4ae2456f6569","Type":"ContainerStarted","Data":"081e62de2927cd4e371ff29c036564b455db429f027c309de7b85e45834f2242"} Sep 30 19:46:21 crc kubenswrapper[4553]: I0930 19:46:21.035333 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc" event={"ID":"fd82e0b0-7700-49a2-9a07-2695b2ffe2fc","Type":"ContainerStarted","Data":"723b0697d84a32ebaac19d174cf7d938dff5ce00c9cdfcec2eae1a8ef2453a84"} Sep 30 19:46:21 crc kubenswrapper[4553]: I0930 19:46:21.046789 4553 generic.go:334] "Generic (PLEG): container finished" podID="6d236a26-bc53-4b88-a8f0-ef72f5ea899b" containerID="fac6f4e2d6dd19846330cd1fb59f8a146d441b0e2e69e29a1302693cfc51156a" exitCode=0 Sep 30 19:46:21 crc kubenswrapper[4553]: I0930 19:46:21.046845 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqf8l" event={"ID":"6d236a26-bc53-4b88-a8f0-ef72f5ea899b","Type":"ContainerDied","Data":"fac6f4e2d6dd19846330cd1fb59f8a146d441b0e2e69e29a1302693cfc51156a"} Sep 30 19:46:21 crc kubenswrapper[4553]: I0930 19:46:21.057623 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" event={"ID":"1b3e5dca-afd2-42de-a39a-e4e6fda92e90","Type":"ContainerStarted","Data":"1d25b0b3b53f39f5c0537de4322693b5a8ad99b9cd3c7fb2d737670de27eadc4"} Sep 30 19:46:21 crc kubenswrapper[4553]: I0930 19:46:21.062247 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" event={"ID":"24a84108-9502-4a67-8452-7ebdf2e358ed","Type":"ContainerStarted","Data":"6c97be4909abd8261ec27c38880b1f7bb5e77a08a4aa8f4a7dc12f12f617e0a0"} Sep 30 19:46:21 crc 
kubenswrapper[4553]: I0930 19:46:21.062762 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" Sep 30 19:46:21 crc kubenswrapper[4553]: E0930 19:46:21.065148 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" podUID="1b3e5dca-afd2-42de-a39a-e4e6fda92e90" Sep 30 19:46:21 crc kubenswrapper[4553]: I0930 19:46:21.082069 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-nbccn" podStartSLOduration=5.109695162 podStartE2EDuration="27.082053462s" podCreationTimestamp="2025-09-30 19:45:54 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.780204083 +0000 UTC m=+809.979706213" lastFinishedPulling="2025-09-30 19:46:18.752562383 +0000 UTC m=+831.952064513" observedRunningTime="2025-09-30 19:46:21.075748234 +0000 UTC m=+834.275250364" watchObservedRunningTime="2025-09-30 19:46:21.082053462 +0000 UTC m=+834.281555592" Sep 30 19:46:21 crc kubenswrapper[4553]: I0930 19:46:21.131584 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" podStartSLOduration=5.949117828 podStartE2EDuration="28.131566676s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.569941981 +0000 UTC m=+809.769444101" lastFinishedPulling="2025-09-30 19:46:18.752390819 +0000 UTC m=+831.951892949" observedRunningTime="2025-09-30 19:46:21.130685112 +0000 UTC m=+834.330187242" watchObservedRunningTime="2025-09-30 19:46:21.131566676 +0000 UTC m=+834.331068816" Sep 30 19:46:21 crc 
kubenswrapper[4553]: E0930 19:46:21.997736 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk" podUID="aebfd6cd-5a72-4797-b16a-492efaa1016e" Sep 30 19:46:22 crc kubenswrapper[4553]: E0930 19:46:22.024397 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25" podUID="e6fe293b-17b2-40c1-ac31-e456a23355b9" Sep 30 19:46:22 crc kubenswrapper[4553]: E0930 19:46:22.049397 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" podUID="e5168b35-e85c-47e3-a641-e7003a2dbae7" Sep 30 19:46:22 crc kubenswrapper[4553]: E0930 19:46:22.053103 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp" podUID="e973f7e5-4256-4b75-8f51-e01ca131eeca" Sep 30 19:46:22 crc kubenswrapper[4553]: E0930 19:46:22.054784 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s" podUID="48e38a70-37cf-4efe-bac1-e0fe7b196b22" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.068395 4553 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp" event={"ID":"e973f7e5-4256-4b75-8f51-e01ca131eeca","Type":"ContainerStarted","Data":"65f3c33e9de0bc8685036897a1296caf5d90d487f98916774f12d0d9834ab16b"} Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.070855 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm" event={"ID":"6d78a774-042e-4b7b-9988-971454080ca0","Type":"ContainerStarted","Data":"5a8d786f0a1683cc1b6535ff36d77331bfd1bda5af281a08ef0698012fb6f5b0"} Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.071226 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.072935 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj" event={"ID":"a0036f14-fb94-4336-9e0b-d501cd080bd5","Type":"ContainerStarted","Data":"1ce1695f356122b1fc285cffbfda7cfb7efe8ec665cf08cca6fd44c7eeffe452"} Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.073291 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.074850 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm" event={"ID":"57489c87-d763-4cf2-a2c6-fd03b1ec7131","Type":"ContainerStarted","Data":"f6fcde7ecbbd182883eed18890f50e7f505a72a7c3b2c9192c22118a32b5c606"} Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.074969 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.077023 4553 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n" event={"ID":"081051c3-9106-4b8f-8850-42facfbb5583","Type":"ContainerStarted","Data":"34b989f83b7652ec9a4b74325e3a61193c32be767c776f4821e77ab8b9b6d1a5"} Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.077474 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.078641 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" event={"ID":"e5168b35-e85c-47e3-a641-e7003a2dbae7","Type":"ContainerStarted","Data":"5965c4ad292ec25443008d52d040009e9d09ef3e9de313f321e642935259bd9c"} Sep 30 19:46:22 crc kubenswrapper[4553]: E0930 19:46:22.079078 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:23fcec0642cbd40af10bca0c5d4e538662d21eda98d6dfec37c38b4d7a47191a\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp" podUID="e973f7e5-4256-4b75-8f51-e01ca131eeca" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.080767 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h" event={"ID":"11181f5a-47aa-4d9b-b3eb-b6c5868bed4b","Type":"ContainerStarted","Data":"aca105948fa55fd530787fe485f215a3a78bbe715ddb822f8c31db2aff9024a3"} Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.080831 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.082380 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5" event={"ID":"485f984b-4520-4753-b6e7-4584137d3d58","Type":"ContainerStarted","Data":"5a6d27acb62964916188a844ab4cdb9a4e3870077d53e1fc349dc3de4319ff38"} Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.082730 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.083945 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" event={"ID":"811049e5-2659-408a-9370-77fe827766e1","Type":"ContainerStarted","Data":"c27b2ff4e0a0aa72dd0894b083d98fe057d66cae830cb4193d87cc12db75b899"} Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.084343 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.085985 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk" event={"ID":"aebfd6cd-5a72-4797-b16a-492efaa1016e","Type":"ContainerStarted","Data":"39bd6598260bbf99ac610f0a94fa6d43ac27b754167a1f262e91cea2f3ad79fe"} Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.087585 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25" event={"ID":"e6fe293b-17b2-40c1-ac31-e456a23355b9","Type":"ContainerStarted","Data":"8d12a9d0a2b4a0b361ba42778f16f5e4bf677ac303e4bb4d067921f4b29562dd"} Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.090024 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh" 
event={"ID":"a38189f9-08d1-4f4b-8949-4856b0f46d95","Type":"ContainerStarted","Data":"f93493211f176dde84cf68d58348b2002ec538ffe6b47428a196981eeee47d62"} Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.090828 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh" Sep 30 19:46:22 crc kubenswrapper[4553]: E0930 19:46:22.090973 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25" podUID="e6fe293b-17b2-40c1-ac31-e456a23355b9" Sep 30 19:46:22 crc kubenswrapper[4553]: E0930 19:46:22.091073 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" podUID="e5168b35-e85c-47e3-a641-e7003a2dbae7" Sep 30 19:46:22 crc kubenswrapper[4553]: E0930 19:46:22.091150 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:bb39758cc8cd0d2cd02841dc81b53fd88647e2db15ee16cdd8c44d4098a942fd\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk" podUID="aebfd6cd-5a72-4797-b16a-492efaa1016e" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.092860 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-m5mwg" 
event={"ID":"0f072097-ae5a-4f90-86bb-8308893409d4","Type":"ContainerStarted","Data":"b83caf67b3e98cc51baab51abf670df1bc4385c48ec311d82d6f82cb383d9f4c"} Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.093112 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-85777745bb-m5mwg" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.099646 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s" event={"ID":"48e38a70-37cf-4efe-bac1-e0fe7b196b22","Type":"ContainerStarted","Data":"82c2beb39748b12c41a91246372ebc730b36bc801e75221c17ef9e141dae4e21"} Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.112836 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h" podStartSLOduration=7.296313495 podStartE2EDuration="29.112813809s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.403245325 +0000 UTC m=+809.602747455" lastFinishedPulling="2025-09-30 19:46:18.219745609 +0000 UTC m=+831.419247769" observedRunningTime="2025-09-30 19:46:22.110350494 +0000 UTC m=+835.309852624" watchObservedRunningTime="2025-09-30 19:46:22.112813809 +0000 UTC m=+835.312315939" Sep 30 19:46:22 crc kubenswrapper[4553]: E0930 19:46:22.140784 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" podUID="1b3e5dca-afd2-42de-a39a-e4e6fda92e90" Sep 30 19:46:22 crc kubenswrapper[4553]: E0930 19:46:22.140879 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:4cdb30423c14ab48888aeeb699259bd9051284ec9f874ed9bab94c7965f45884\\\"\"" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s" podUID="48e38a70-37cf-4efe-bac1-e0fe7b196b22" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.146147 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n" podStartSLOduration=6.5574551020000005 podStartE2EDuration="29.14613065s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.096308739 +0000 UTC m=+809.295810869" lastFinishedPulling="2025-09-30 19:46:18.684984287 +0000 UTC m=+831.884486417" observedRunningTime="2025-09-30 19:46:22.142909834 +0000 UTC m=+835.342411964" watchObservedRunningTime="2025-09-30 19:46:22.14613065 +0000 UTC m=+835.345632780" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.256897 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5" podStartSLOduration=6.251186285 podStartE2EDuration="29.256880591s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:55.214458393 +0000 UTC m=+808.413960523" lastFinishedPulling="2025-09-30 19:46:18.220152689 +0000 UTC m=+831.419654829" observedRunningTime="2025-09-30 19:46:22.25385255 +0000 UTC m=+835.453354680" watchObservedRunningTime="2025-09-30 19:46:22.256880591 +0000 UTC m=+835.456382721" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.278528 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm" podStartSLOduration=7.505752806 podStartE2EDuration="29.27851252s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 
19:45:56.448163386 +0000 UTC m=+809.647665516" lastFinishedPulling="2025-09-30 19:46:18.22092309 +0000 UTC m=+831.420425230" observedRunningTime="2025-09-30 19:46:22.272599071 +0000 UTC m=+835.472101201" watchObservedRunningTime="2025-09-30 19:46:22.27851252 +0000 UTC m=+835.478014650" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.313676 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj" podStartSLOduration=7.253736727 podStartE2EDuration="29.313655649s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.161164903 +0000 UTC m=+809.360667033" lastFinishedPulling="2025-09-30 19:46:18.221083825 +0000 UTC m=+831.420585955" observedRunningTime="2025-09-30 19:46:22.308719897 +0000 UTC m=+835.508222027" watchObservedRunningTime="2025-09-30 19:46:22.313655649 +0000 UTC m=+835.513157779" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.335389 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" podStartSLOduration=7.037576829 podStartE2EDuration="29.335372229s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.479376331 +0000 UTC m=+809.678878461" lastFinishedPulling="2025-09-30 19:46:18.777171731 +0000 UTC m=+831.976673861" observedRunningTime="2025-09-30 19:46:22.33129795 +0000 UTC m=+835.530800080" watchObservedRunningTime="2025-09-30 19:46:22.335372229 +0000 UTC m=+835.534874359" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.371520 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm" podStartSLOduration=6.801478476 podStartE2EDuration="29.371501285s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.114493165 +0000 UTC 
m=+809.313995285" lastFinishedPulling="2025-09-30 19:46:18.684515964 +0000 UTC m=+831.884018094" observedRunningTime="2025-09-30 19:46:22.355530239 +0000 UTC m=+835.555032369" watchObservedRunningTime="2025-09-30 19:46:22.371501285 +0000 UTC m=+835.571003415" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.500889 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-85777745bb-m5mwg" podStartSLOduration=7.73558345 podStartE2EDuration="29.500873194s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.454882646 +0000 UTC m=+809.654384766" lastFinishedPulling="2025-09-30 19:46:18.22017238 +0000 UTC m=+831.419674510" observedRunningTime="2025-09-30 19:46:22.497677388 +0000 UTC m=+835.697179508" watchObservedRunningTime="2025-09-30 19:46:22.500873194 +0000 UTC m=+835.700375324" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.578917 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh" podStartSLOduration=7.008333217 podStartE2EDuration="29.57889715s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:55.649700739 +0000 UTC m=+808.849202859" lastFinishedPulling="2025-09-30 19:46:18.220264622 +0000 UTC m=+831.419766792" observedRunningTime="2025-09-30 19:46:22.575570181 +0000 UTC m=+835.775072301" watchObservedRunningTime="2025-09-30 19:46:22.57889715 +0000 UTC m=+835.778399280" Sep 30 19:46:22 crc kubenswrapper[4553]: I0930 19:46:22.583760 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6" podStartSLOduration=6.154032338 podStartE2EDuration="29.58373792s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:55.254846433 +0000 UTC m=+808.454348563" 
lastFinishedPulling="2025-09-30 19:46:18.684551985 +0000 UTC m=+831.884054145" observedRunningTime="2025-09-30 19:46:22.54672089 +0000 UTC m=+835.746223020" watchObservedRunningTime="2025-09-30 19:46:22.58373792 +0000 UTC m=+835.783240050" Sep 30 19:46:23 crc kubenswrapper[4553]: I0930 19:46:23.106884 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2" event={"ID":"826f8aa9-d307-4845-8e61-dc907c69a18c","Type":"ContainerStarted","Data":"7254df52795bc4047b60a228b392273ae06a8d05432d4c1b1ca25dfce6eaedd9"} Sep 30 19:46:23 crc kubenswrapper[4553]: I0930 19:46:23.107166 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2" Sep 30 19:46:23 crc kubenswrapper[4553]: I0930 19:46:23.108908 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc" event={"ID":"fd82e0b0-7700-49a2-9a07-2695b2ffe2fc","Type":"ContainerStarted","Data":"2a309694fbae0f71201d94d04e59eb706cd78759d8524a2a7ef4b2b6785eaa43"} Sep 30 19:46:23 crc kubenswrapper[4553]: I0930 19:46:23.109074 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc" Sep 30 19:46:23 crc kubenswrapper[4553]: I0930 19:46:23.110808 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqf8l" event={"ID":"6d236a26-bc53-4b88-a8f0-ef72f5ea899b","Type":"ContainerStarted","Data":"454157c0a5f410ba46d1fffefc1ac80b4b25b4c00757abddf231a3e747860497"} Sep 30 19:46:23 crc kubenswrapper[4553]: I0930 19:46:23.112601 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p" 
event={"ID":"fad8e76f-5b93-44c1-98d2-3f6f756cc23c","Type":"ContainerStarted","Data":"f46bee66f4171941ef4276fed5c112f5956e009416b77bd09c19ce4f07b78ede"} Sep 30 19:46:23 crc kubenswrapper[4553]: E0930 19:46:23.114006 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" podUID="e5168b35-e85c-47e3-a641-e7003a2dbae7" Sep 30 19:46:23 crc kubenswrapper[4553]: E0930 19:46:23.114898 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:23fcec0642cbd40af10bca0c5d4e538662d21eda98d6dfec37c38b4d7a47191a\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp" podUID="e973f7e5-4256-4b75-8f51-e01ca131eeca" Sep 30 19:46:23 crc kubenswrapper[4553]: E0930 19:46:23.114980 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:bb39758cc8cd0d2cd02841dc81b53fd88647e2db15ee16cdd8c44d4098a942fd\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk" podUID="aebfd6cd-5a72-4797-b16a-492efaa1016e" Sep 30 19:46:23 crc kubenswrapper[4553]: I0930 19:46:23.133459 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2" podStartSLOduration=3.415344923 podStartE2EDuration="29.133440505s" podCreationTimestamp="2025-09-30 19:45:54 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.450713494 +0000 UTC m=+809.650215624" 
lastFinishedPulling="2025-09-30 19:46:22.168809076 +0000 UTC m=+835.368311206" observedRunningTime="2025-09-30 19:46:23.133189729 +0000 UTC m=+836.332691859" watchObservedRunningTime="2025-09-30 19:46:23.133440505 +0000 UTC m=+836.332942635" Sep 30 19:46:23 crc kubenswrapper[4553]: I0930 19:46:23.219186 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p" podStartSLOduration=3.860966362 podStartE2EDuration="30.219170977s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.174222743 +0000 UTC m=+809.373724873" lastFinishedPulling="2025-09-30 19:46:22.532427358 +0000 UTC m=+835.731929488" observedRunningTime="2025-09-30 19:46:23.211469792 +0000 UTC m=+836.410971922" watchObservedRunningTime="2025-09-30 19:46:23.219170977 +0000 UTC m=+836.418673107" Sep 30 19:46:23 crc kubenswrapper[4553]: I0930 19:46:23.243951 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hqf8l" podStartSLOduration=3.816376559 podStartE2EDuration="28.243935049s" podCreationTimestamp="2025-09-30 19:45:55 +0000 UTC" firstStartedPulling="2025-09-30 19:45:57.664730751 +0000 UTC m=+810.864232881" lastFinishedPulling="2025-09-30 19:46:22.092289241 +0000 UTC m=+835.291791371" observedRunningTime="2025-09-30 19:46:23.240699963 +0000 UTC m=+836.440202093" watchObservedRunningTime="2025-09-30 19:46:23.243935049 +0000 UTC m=+836.443437179" Sep 30 19:46:23 crc kubenswrapper[4553]: I0930 19:46:23.283413 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc" podStartSLOduration=4.035852536 podStartE2EDuration="30.283396744s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.10084487 +0000 UTC m=+809.300347000" lastFinishedPulling="2025-09-30 19:46:22.348389078 
+0000 UTC m=+835.547891208" observedRunningTime="2025-09-30 19:46:23.280889478 +0000 UTC m=+836.480391608" watchObservedRunningTime="2025-09-30 19:46:23.283396744 +0000 UTC m=+836.482898874" Sep 30 19:46:23 crc kubenswrapper[4553]: I0930 19:46:23.885827 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6" Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.079120 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-s8nwm" Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.141351 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s" event={"ID":"48e38a70-37cf-4efe-bac1-e0fe7b196b22","Type":"ContainerStarted","Data":"dc277b6c44b16dc513a68df1f29dfd997a62a73399c12f1fdb453a5f501b9c57"} Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.142434 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s" Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.144986 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25" event={"ID":"e6fe293b-17b2-40c1-ac31-e456a23355b9","Type":"ContainerStarted","Data":"fa8aea12f2103c1cae5fb399dc279cc5d7548483f66187313b41fc9392eacdeb"} Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.147007 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25" Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.147085 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p" Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.150970 
4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-9b2qm" Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.153796 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-xk7kj" Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.153878 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-r7vxh" Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.154174 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-qhv4n" Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.154754 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-5k8k5" Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.160224 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s" podStartSLOduration=3.661607113 podStartE2EDuration="31.160209647s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.152194033 +0000 UTC m=+809.351696153" lastFinishedPulling="2025-09-30 19:46:23.650796557 +0000 UTC m=+836.850298687" observedRunningTime="2025-09-30 19:46:24.155975373 +0000 UTC m=+837.355477503" watchObservedRunningTime="2025-09-30 19:46:24.160209647 +0000 UTC m=+837.359711777" Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.210207 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25" podStartSLOduration=3.694287746 podStartE2EDuration="31.210187892s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" 
firstStartedPulling="2025-09-30 19:45:56.130911135 +0000 UTC m=+809.330413265" lastFinishedPulling="2025-09-30 19:46:23.646811281 +0000 UTC m=+836.846313411" observedRunningTime="2025-09-30 19:46:24.192712845 +0000 UTC m=+837.392214985" watchObservedRunningTime="2025-09-30 19:46:24.210187892 +0000 UTC m=+837.409690022" Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.375751 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-lgc4h" Sep 30 19:46:24 crc kubenswrapper[4553]: I0930 19:46:24.770895 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-85777745bb-m5mwg" Sep 30 19:46:25 crc kubenswrapper[4553]: I0930 19:46:25.094634 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8crsqm8" Sep 30 19:46:25 crc kubenswrapper[4553]: I0930 19:46:25.584721 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:46:25 crc kubenswrapper[4553]: I0930 19:46:25.584788 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:46:26 crc kubenswrapper[4553]: I0930 19:46:26.643895 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hqf8l" podUID="6d236a26-bc53-4b88-a8f0-ef72f5ea899b" containerName="registry-server" probeResult="failure" output=< Sep 30 19:46:26 crc kubenswrapper[4553]: timeout: failed to connect service ":50051" within 1s Sep 30 19:46:26 crc kubenswrapper[4553]: > Sep 30 19:46:33 crc kubenswrapper[4553]: I0930 19:46:33.889427 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-ghxx6" Sep 30 
19:46:33 crc kubenswrapper[4553]: I0930 19:46:33.925853 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-9bbsc" Sep 30 19:46:34 crc kubenswrapper[4553]: I0930 19:46:34.108921 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-4sf9s" Sep 30 19:46:34 crc kubenswrapper[4553]: I0930 19:46:34.276583 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-5lc25" Sep 30 19:46:34 crc kubenswrapper[4553]: I0930 19:46:34.373437 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-jl46p" Sep 30 19:46:34 crc kubenswrapper[4553]: I0930 19:46:34.523174 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-grf94" Sep 30 19:46:34 crc kubenswrapper[4553]: I0930 19:46:34.733336 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-hqq9l" Sep 30 19:46:34 crc kubenswrapper[4553]: I0930 19:46:34.780163 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kgtt2" Sep 30 19:46:35 crc kubenswrapper[4553]: I0930 19:46:35.251973 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk" event={"ID":"aebfd6cd-5a72-4797-b16a-492efaa1016e","Type":"ContainerStarted","Data":"872cacbb39efaca35c3b7bf5690feab9e3c4172f131b12365be2866fcbeb42c8"} Sep 30 19:46:35 crc kubenswrapper[4553]: I0930 19:46:35.252310 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk" Sep 30 19:46:35 crc kubenswrapper[4553]: I0930 19:46:35.269796 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk" podStartSLOduration=2.4858002949999998 podStartE2EDuration="42.269774975s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:55.220827144 +0000 UTC m=+808.420329274" lastFinishedPulling="2025-09-30 19:46:35.004801824 +0000 UTC m=+848.204303954" observedRunningTime="2025-09-30 19:46:35.266930569 +0000 UTC m=+848.466432729" watchObservedRunningTime="2025-09-30 19:46:35.269774975 +0000 UTC m=+848.469277115" Sep 30 19:46:35 crc kubenswrapper[4553]: I0930 19:46:35.622295 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:46:35 crc kubenswrapper[4553]: I0930 19:46:35.664969 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:46:35 crc kubenswrapper[4553]: I0930 19:46:35.854713 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hqf8l"] Sep 30 19:46:36 crc kubenswrapper[4553]: I0930 19:46:36.264938 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" event={"ID":"1b3e5dca-afd2-42de-a39a-e4e6fda92e90","Type":"ContainerStarted","Data":"017d49aa3ffa2943d9088f29f4becc1a0a76aaa41c22c3064ff17716eca778dc"} Sep 30 19:46:36 crc kubenswrapper[4553]: I0930 19:46:36.304632 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" podStartSLOduration=4.902745758 podStartE2EDuration="43.304601961s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" 
firstStartedPulling="2025-09-30 19:45:56.698378195 +0000 UTC m=+809.897880325" lastFinishedPulling="2025-09-30 19:46:35.100234378 +0000 UTC m=+848.299736528" observedRunningTime="2025-09-30 19:46:36.298687693 +0000 UTC m=+849.498189853" watchObservedRunningTime="2025-09-30 19:46:36.304601961 +0000 UTC m=+849.504104131" Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.272571 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hqf8l" podUID="6d236a26-bc53-4b88-a8f0-ef72f5ea899b" containerName="registry-server" containerID="cri-o://454157c0a5f410ba46d1fffefc1ac80b4b25b4c00757abddf231a3e747860497" gracePeriod=2 Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.273018 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" event={"ID":"e5168b35-e85c-47e3-a641-e7003a2dbae7","Type":"ContainerStarted","Data":"71f3a04c52fec1c7c77c996f2249465fe03ff85a50e43fd54f75a85f4b6b0b7a"} Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.273655 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.293576 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" podStartSLOduration=4.144227224 podStartE2EDuration="44.293557559s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:56.876486797 +0000 UTC m=+810.075988927" lastFinishedPulling="2025-09-30 19:46:37.025817092 +0000 UTC m=+850.225319262" observedRunningTime="2025-09-30 19:46:37.290149597 +0000 UTC m=+850.489651737" watchObservedRunningTime="2025-09-30 19:46:37.293557559 +0000 UTC m=+850.493059689" Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.786326 4553 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.820893 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8lht\" (UniqueName: \"kubernetes.io/projected/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-kube-api-access-h8lht\") pod \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\" (UID: \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\") " Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.820952 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-utilities\") pod \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\" (UID: \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\") " Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.820981 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-catalog-content\") pod \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\" (UID: \"6d236a26-bc53-4b88-a8f0-ef72f5ea899b\") " Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.828449 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-utilities" (OuterVolumeSpecName: "utilities") pod "6d236a26-bc53-4b88-a8f0-ef72f5ea899b" (UID: "6d236a26-bc53-4b88-a8f0-ef72f5ea899b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.832695 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-kube-api-access-h8lht" (OuterVolumeSpecName: "kube-api-access-h8lht") pod "6d236a26-bc53-4b88-a8f0-ef72f5ea899b" (UID: "6d236a26-bc53-4b88-a8f0-ef72f5ea899b"). 
InnerVolumeSpecName "kube-api-access-h8lht". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.876557 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d236a26-bc53-4b88-a8f0-ef72f5ea899b" (UID: "6d236a26-bc53-4b88-a8f0-ef72f5ea899b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.930621 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8lht\" (UniqueName: \"kubernetes.io/projected/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-kube-api-access-h8lht\") on node \"crc\" DevicePath \"\"" Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.930649 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:46:37 crc kubenswrapper[4553]: I0930 19:46:37.931656 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d236a26-bc53-4b88-a8f0-ef72f5ea899b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.284654 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp" event={"ID":"e973f7e5-4256-4b75-8f51-e01ca131eeca","Type":"ContainerStarted","Data":"1239974bdde1d38b32bcff4d499d7e5506c2d0d7ee622f60826601dca8baeaa7"} Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.285783 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp" Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.291652 4553 
generic.go:334] "Generic (PLEG): container finished" podID="6d236a26-bc53-4b88-a8f0-ef72f5ea899b" containerID="454157c0a5f410ba46d1fffefc1ac80b4b25b4c00757abddf231a3e747860497" exitCode=0 Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.291687 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hqf8l" Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.291729 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqf8l" event={"ID":"6d236a26-bc53-4b88-a8f0-ef72f5ea899b","Type":"ContainerDied","Data":"454157c0a5f410ba46d1fffefc1ac80b4b25b4c00757abddf231a3e747860497"} Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.291782 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hqf8l" event={"ID":"6d236a26-bc53-4b88-a8f0-ef72f5ea899b","Type":"ContainerDied","Data":"42369be7e9139f64015536799b46e87296b1b23533cdce8eced83bff5d26e764"} Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.291825 4553 scope.go:117] "RemoveContainer" containerID="454157c0a5f410ba46d1fffefc1ac80b4b25b4c00757abddf231a3e747860497" Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.325557 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp" podStartSLOduration=3.637679266 podStartE2EDuration="45.325493577s" podCreationTimestamp="2025-09-30 19:45:53 +0000 UTC" firstStartedPulling="2025-09-30 19:45:55.480122806 +0000 UTC m=+808.679624936" lastFinishedPulling="2025-09-30 19:46:37.167937107 +0000 UTC m=+850.367439247" observedRunningTime="2025-09-30 19:46:38.307508775 +0000 UTC m=+851.507010965" watchObservedRunningTime="2025-09-30 19:46:38.325493577 +0000 UTC m=+851.524995737" Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.344160 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-hqf8l"] Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.344892 4553 scope.go:117] "RemoveContainer" containerID="fac6f4e2d6dd19846330cd1fb59f8a146d441b0e2e69e29a1302693cfc51156a" Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.349314 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hqf8l"] Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.374663 4553 scope.go:117] "RemoveContainer" containerID="adee2b3e22ad48992d67825d028f8b2ebe740b77c83524e0aa4eaafa5e782388" Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.417776 4553 scope.go:117] "RemoveContainer" containerID="454157c0a5f410ba46d1fffefc1ac80b4b25b4c00757abddf231a3e747860497" Sep 30 19:46:38 crc kubenswrapper[4553]: E0930 19:46:38.418269 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"454157c0a5f410ba46d1fffefc1ac80b4b25b4c00757abddf231a3e747860497\": container with ID starting with 454157c0a5f410ba46d1fffefc1ac80b4b25b4c00757abddf231a3e747860497 not found: ID does not exist" containerID="454157c0a5f410ba46d1fffefc1ac80b4b25b4c00757abddf231a3e747860497" Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.418355 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"454157c0a5f410ba46d1fffefc1ac80b4b25b4c00757abddf231a3e747860497"} err="failed to get container status \"454157c0a5f410ba46d1fffefc1ac80b4b25b4c00757abddf231a3e747860497\": rpc error: code = NotFound desc = could not find container \"454157c0a5f410ba46d1fffefc1ac80b4b25b4c00757abddf231a3e747860497\": container with ID starting with 454157c0a5f410ba46d1fffefc1ac80b4b25b4c00757abddf231a3e747860497 not found: ID does not exist" Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.418400 4553 scope.go:117] "RemoveContainer" 
containerID="fac6f4e2d6dd19846330cd1fb59f8a146d441b0e2e69e29a1302693cfc51156a" Sep 30 19:46:38 crc kubenswrapper[4553]: E0930 19:46:38.418814 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac6f4e2d6dd19846330cd1fb59f8a146d441b0e2e69e29a1302693cfc51156a\": container with ID starting with fac6f4e2d6dd19846330cd1fb59f8a146d441b0e2e69e29a1302693cfc51156a not found: ID does not exist" containerID="fac6f4e2d6dd19846330cd1fb59f8a146d441b0e2e69e29a1302693cfc51156a" Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.418856 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac6f4e2d6dd19846330cd1fb59f8a146d441b0e2e69e29a1302693cfc51156a"} err="failed to get container status \"fac6f4e2d6dd19846330cd1fb59f8a146d441b0e2e69e29a1302693cfc51156a\": rpc error: code = NotFound desc = could not find container \"fac6f4e2d6dd19846330cd1fb59f8a146d441b0e2e69e29a1302693cfc51156a\": container with ID starting with fac6f4e2d6dd19846330cd1fb59f8a146d441b0e2e69e29a1302693cfc51156a not found: ID does not exist" Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.418883 4553 scope.go:117] "RemoveContainer" containerID="adee2b3e22ad48992d67825d028f8b2ebe740b77c83524e0aa4eaafa5e782388" Sep 30 19:46:38 crc kubenswrapper[4553]: E0930 19:46:38.419482 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adee2b3e22ad48992d67825d028f8b2ebe740b77c83524e0aa4eaafa5e782388\": container with ID starting with adee2b3e22ad48992d67825d028f8b2ebe740b77c83524e0aa4eaafa5e782388 not found: ID does not exist" containerID="adee2b3e22ad48992d67825d028f8b2ebe740b77c83524e0aa4eaafa5e782388" Sep 30 19:46:38 crc kubenswrapper[4553]: I0930 19:46:38.419506 4553 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"adee2b3e22ad48992d67825d028f8b2ebe740b77c83524e0aa4eaafa5e782388"} err="failed to get container status \"adee2b3e22ad48992d67825d028f8b2ebe740b77c83524e0aa4eaafa5e782388\": rpc error: code = NotFound desc = could not find container \"adee2b3e22ad48992d67825d028f8b2ebe740b77c83524e0aa4eaafa5e782388\": container with ID starting with adee2b3e22ad48992d67825d028f8b2ebe740b77c83524e0aa4eaafa5e782388 not found: ID does not exist" Sep 30 19:46:39 crc kubenswrapper[4553]: I0930 19:46:39.521170 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d236a26-bc53-4b88-a8f0-ef72f5ea899b" path="/var/lib/kubelet/pods/6d236a26-bc53-4b88-a8f0-ef72f5ea899b/volumes" Sep 30 19:46:43 crc kubenswrapper[4553]: I0930 19:46:43.824197 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-r2zwk" Sep 30 19:46:44 crc kubenswrapper[4553]: I0930 19:46:44.103409 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-sjwqp" Sep 30 19:46:44 crc kubenswrapper[4553]: I0930 19:46:44.832509 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-8cnk4" Sep 30 19:46:45 crc kubenswrapper[4553]: I0930 19:46:45.523768 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" Sep 30 19:46:45 crc kubenswrapper[4553]: I0930 19:46:45.527335 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-tk4kk" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.712915 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l96lx"] Sep 30 19:47:04 crc kubenswrapper[4553]: E0930 
19:47:04.713825 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d236a26-bc53-4b88-a8f0-ef72f5ea899b" containerName="registry-server" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.713840 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d236a26-bc53-4b88-a8f0-ef72f5ea899b" containerName="registry-server" Sep 30 19:47:04 crc kubenswrapper[4553]: E0930 19:47:04.713866 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d236a26-bc53-4b88-a8f0-ef72f5ea899b" containerName="extract-utilities" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.713872 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d236a26-bc53-4b88-a8f0-ef72f5ea899b" containerName="extract-utilities" Sep 30 19:47:04 crc kubenswrapper[4553]: E0930 19:47:04.713886 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d236a26-bc53-4b88-a8f0-ef72f5ea899b" containerName="extract-content" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.713892 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d236a26-bc53-4b88-a8f0-ef72f5ea899b" containerName="extract-content" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.714087 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d236a26-bc53-4b88-a8f0-ef72f5ea899b" containerName="registry-server" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.714870 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-l96lx" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.724010 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.726011 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.726451 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.733712 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-wg799" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.745198 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l96lx"] Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.822150 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-cp6js"] Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.823637 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.824215 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfrg7\" (UniqueName: \"kubernetes.io/projected/4e77e2cf-0071-4293-ab5d-22c19d2a0f3f-kube-api-access-mfrg7\") pod \"dnsmasq-dns-675f4bcbfc-l96lx\" (UID: \"4e77e2cf-0071-4293-ab5d-22c19d2a0f3f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l96lx" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.824313 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e77e2cf-0071-4293-ab5d-22c19d2a0f3f-config\") pod \"dnsmasq-dns-675f4bcbfc-l96lx\" (UID: \"4e77e2cf-0071-4293-ab5d-22c19d2a0f3f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l96lx" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.825588 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.843258 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-cp6js"] Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.925648 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfrg7\" (UniqueName: \"kubernetes.io/projected/4e77e2cf-0071-4293-ab5d-22c19d2a0f3f-kube-api-access-mfrg7\") pod \"dnsmasq-dns-675f4bcbfc-l96lx\" (UID: \"4e77e2cf-0071-4293-ab5d-22c19d2a0f3f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l96lx" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.925722 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed1d18d-8390-45cd-baa9-94ba69b32def-config\") pod \"dnsmasq-dns-78dd6ddcc-cp6js\" (UID: \"5ed1d18d-8390-45cd-baa9-94ba69b32def\") " pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" Sep 
30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.925748 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e77e2cf-0071-4293-ab5d-22c19d2a0f3f-config\") pod \"dnsmasq-dns-675f4bcbfc-l96lx\" (UID: \"4e77e2cf-0071-4293-ab5d-22c19d2a0f3f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l96lx" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.925781 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ed1d18d-8390-45cd-baa9-94ba69b32def-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-cp6js\" (UID: \"5ed1d18d-8390-45cd-baa9-94ba69b32def\") " pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.925807 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krdtr\" (UniqueName: \"kubernetes.io/projected/5ed1d18d-8390-45cd-baa9-94ba69b32def-kube-api-access-krdtr\") pod \"dnsmasq-dns-78dd6ddcc-cp6js\" (UID: \"5ed1d18d-8390-45cd-baa9-94ba69b32def\") " pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.926829 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e77e2cf-0071-4293-ab5d-22c19d2a0f3f-config\") pod \"dnsmasq-dns-675f4bcbfc-l96lx\" (UID: \"4e77e2cf-0071-4293-ab5d-22c19d2a0f3f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l96lx" Sep 30 19:47:04 crc kubenswrapper[4553]: I0930 19:47:04.955696 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfrg7\" (UniqueName: \"kubernetes.io/projected/4e77e2cf-0071-4293-ab5d-22c19d2a0f3f-kube-api-access-mfrg7\") pod \"dnsmasq-dns-675f4bcbfc-l96lx\" (UID: \"4e77e2cf-0071-4293-ab5d-22c19d2a0f3f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-l96lx" Sep 30 19:47:05 crc kubenswrapper[4553]: I0930 
19:47:05.027206 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ed1d18d-8390-45cd-baa9-94ba69b32def-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-cp6js\" (UID: \"5ed1d18d-8390-45cd-baa9-94ba69b32def\") " pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" Sep 30 19:47:05 crc kubenswrapper[4553]: I0930 19:47:05.027261 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krdtr\" (UniqueName: \"kubernetes.io/projected/5ed1d18d-8390-45cd-baa9-94ba69b32def-kube-api-access-krdtr\") pod \"dnsmasq-dns-78dd6ddcc-cp6js\" (UID: \"5ed1d18d-8390-45cd-baa9-94ba69b32def\") " pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" Sep 30 19:47:05 crc kubenswrapper[4553]: I0930 19:47:05.027326 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed1d18d-8390-45cd-baa9-94ba69b32def-config\") pod \"dnsmasq-dns-78dd6ddcc-cp6js\" (UID: \"5ed1d18d-8390-45cd-baa9-94ba69b32def\") " pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" Sep 30 19:47:05 crc kubenswrapper[4553]: I0930 19:47:05.028108 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed1d18d-8390-45cd-baa9-94ba69b32def-config\") pod \"dnsmasq-dns-78dd6ddcc-cp6js\" (UID: \"5ed1d18d-8390-45cd-baa9-94ba69b32def\") " pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" Sep 30 19:47:05 crc kubenswrapper[4553]: I0930 19:47:05.028493 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ed1d18d-8390-45cd-baa9-94ba69b32def-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-cp6js\" (UID: \"5ed1d18d-8390-45cd-baa9-94ba69b32def\") " pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" Sep 30 19:47:05 crc kubenswrapper[4553]: I0930 19:47:05.042841 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krdtr\" 
(UniqueName: \"kubernetes.io/projected/5ed1d18d-8390-45cd-baa9-94ba69b32def-kube-api-access-krdtr\") pod \"dnsmasq-dns-78dd6ddcc-cp6js\" (UID: \"5ed1d18d-8390-45cd-baa9-94ba69b32def\") " pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" Sep 30 19:47:05 crc kubenswrapper[4553]: I0930 19:47:05.045600 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-l96lx" Sep 30 19:47:05 crc kubenswrapper[4553]: I0930 19:47:05.138323 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" Sep 30 19:47:05 crc kubenswrapper[4553]: I0930 19:47:05.499774 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l96lx"] Sep 30 19:47:05 crc kubenswrapper[4553]: I0930 19:47:05.504948 4553 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 19:47:05 crc kubenswrapper[4553]: I0930 19:47:05.563359 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-l96lx" event={"ID":"4e77e2cf-0071-4293-ab5d-22c19d2a0f3f","Type":"ContainerStarted","Data":"b93f1a0b172cda7dd92745bc1b651060f941d7c13d3b66a065778578a4a686c6"} Sep 30 19:47:05 crc kubenswrapper[4553]: I0930 19:47:05.588589 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-cp6js"] Sep 30 19:47:06 crc kubenswrapper[4553]: I0930 19:47:06.571032 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" event={"ID":"5ed1d18d-8390-45cd-baa9-94ba69b32def","Type":"ContainerStarted","Data":"4267bd52384eea02ea2c028059c56380f9f7e5076244c5e1f4e0226eeec11948"} Sep 30 19:47:07 crc kubenswrapper[4553]: I0930 19:47:07.726973 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l96lx"] Sep 30 19:47:07 crc kubenswrapper[4553]: I0930 19:47:07.745139 4553 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-5ccc8479f9-ltb2t"] Sep 30 19:47:07 crc kubenswrapper[4553]: I0930 19:47:07.746551 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:07 crc kubenswrapper[4553]: I0930 19:47:07.760536 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ltb2t"] Sep 30 19:47:07 crc kubenswrapper[4553]: I0930 19:47:07.796732 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jzrq\" (UniqueName: \"kubernetes.io/projected/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-kube-api-access-6jzrq\") pod \"dnsmasq-dns-5ccc8479f9-ltb2t\" (UID: \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:07 crc kubenswrapper[4553]: I0930 19:47:07.796773 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-config\") pod \"dnsmasq-dns-5ccc8479f9-ltb2t\" (UID: \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:07 crc kubenswrapper[4553]: I0930 19:47:07.796890 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-ltb2t\" (UID: \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:07 crc kubenswrapper[4553]: I0930 19:47:07.897895 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jzrq\" (UniqueName: \"kubernetes.io/projected/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-kube-api-access-6jzrq\") pod \"dnsmasq-dns-5ccc8479f9-ltb2t\" (UID: \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:07 crc kubenswrapper[4553]: I0930 19:47:07.897946 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-config\") pod \"dnsmasq-dns-5ccc8479f9-ltb2t\" (UID: \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:07 crc kubenswrapper[4553]: I0930 19:47:07.897980 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-ltb2t\" (UID: \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:07 crc kubenswrapper[4553]: I0930 19:47:07.898815 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-ltb2t\" (UID: \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:07 crc kubenswrapper[4553]: I0930 19:47:07.898829 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-config\") pod \"dnsmasq-dns-5ccc8479f9-ltb2t\" (UID: \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:07 crc kubenswrapper[4553]: I0930 19:47:07.919679 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jzrq\" (UniqueName: \"kubernetes.io/projected/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-kube-api-access-6jzrq\") pod \"dnsmasq-dns-5ccc8479f9-ltb2t\" (UID: \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\") " pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.075933 4553 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-cp6js"] Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.082402 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.133179 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7m29p"] Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.134372 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.150618 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7m29p"] Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.206260 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd5b9d77-f719-44fa-ad65-3562931d6e37-config\") pod \"dnsmasq-dns-57d769cc4f-7m29p\" (UID: \"bd5b9d77-f719-44fa-ad65-3562931d6e37\") " pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.206320 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd5b9d77-f719-44fa-ad65-3562931d6e37-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7m29p\" (UID: \"bd5b9d77-f719-44fa-ad65-3562931d6e37\") " pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.206343 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4np94\" (UniqueName: \"kubernetes.io/projected/bd5b9d77-f719-44fa-ad65-3562931d6e37-kube-api-access-4np94\") pod \"dnsmasq-dns-57d769cc4f-7m29p\" (UID: \"bd5b9d77-f719-44fa-ad65-3562931d6e37\") " pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 
30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.307707 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd5b9d77-f719-44fa-ad65-3562931d6e37-config\") pod \"dnsmasq-dns-57d769cc4f-7m29p\" (UID: \"bd5b9d77-f719-44fa-ad65-3562931d6e37\") " pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.308387 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd5b9d77-f719-44fa-ad65-3562931d6e37-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7m29p\" (UID: \"bd5b9d77-f719-44fa-ad65-3562931d6e37\") " pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.308425 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4np94\" (UniqueName: \"kubernetes.io/projected/bd5b9d77-f719-44fa-ad65-3562931d6e37-kube-api-access-4np94\") pod \"dnsmasq-dns-57d769cc4f-7m29p\" (UID: \"bd5b9d77-f719-44fa-ad65-3562931d6e37\") " pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.308604 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd5b9d77-f719-44fa-ad65-3562931d6e37-config\") pod \"dnsmasq-dns-57d769cc4f-7m29p\" (UID: \"bd5b9d77-f719-44fa-ad65-3562931d6e37\") " pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.314704 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd5b9d77-f719-44fa-ad65-3562931d6e37-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-7m29p\" (UID: \"bd5b9d77-f719-44fa-ad65-3562931d6e37\") " pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.327221 4553 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4np94\" (UniqueName: \"kubernetes.io/projected/bd5b9d77-f719-44fa-ad65-3562931d6e37-kube-api-access-4np94\") pod \"dnsmasq-dns-57d769cc4f-7m29p\" (UID: \"bd5b9d77-f719-44fa-ad65-3562931d6e37\") " pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.468696 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.721985 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ltb2t"] Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.911126 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.912971 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.916423 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.916661 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.916679 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.916770 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.916781 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.916850 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 
19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.920407 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 19:47:08 crc kubenswrapper[4553]: I0930 19:47:08.921128 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-phdzt" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.008184 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7m29p"] Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.024611 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.024666 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5bde6e85-a37e-4cec-a759-b0cd4eea2807-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.024685 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.024718 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.024740 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5bde6e85-a37e-4cec-a759-b0cd4eea2807-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.024784 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.024801 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.024820 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.024887 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.024907 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxlh8\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-kube-api-access-wxlh8\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.024925 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.126254 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5bde6e85-a37e-4cec-a759-b0cd4eea2807-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.126298 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.126327 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " 
pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.126346 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5bde6e85-a37e-4cec-a759-b0cd4eea2807-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.126390 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.126408 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.126529 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.126547 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 
19:47:09.126607 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxlh8\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-kube-api-access-wxlh8\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.126643 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.126682 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.127948 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.128033 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.128256 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.128436 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.128618 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.129792 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.134404 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.135184 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5bde6e85-a37e-4cec-a759-b0cd4eea2807-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.139521 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5bde6e85-a37e-4cec-a759-b0cd4eea2807-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.140057 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.157852 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxlh8\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-kube-api-access-wxlh8\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.159819 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.271071 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.291254 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.294482 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.298492 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-z6vj8" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.298671 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.298853 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.298988 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.299156 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.299295 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.299477 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.302334 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.431425 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxwtv\" (UniqueName: 
\"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-kube-api-access-gxwtv\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.431836 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-config-data\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.431915 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.431941 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.432123 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.432164 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/7c4de23a-3df4-47a2-86f1-436a8b11c22d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.432201 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.432224 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.432255 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.432443 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.432632 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/7c4de23a-3df4-47a2-86f1-436a8b11c22d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.534182 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.534224 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c4de23a-3df4-47a2-86f1-436a8b11c22d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.534257 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.534273 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.534326 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" 
Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.534346 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.534374 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c4de23a-3df4-47a2-86f1-436a8b11c22d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.534396 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-config-data\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.534412 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxwtv\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-kube-api-access-gxwtv\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.534414 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.534432 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.534468 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.535356 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.535737 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-config-data\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.535862 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.535928 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " 
pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.536147 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.539448 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.540017 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c4de23a-3df4-47a2-86f1-436a8b11c22d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.540778 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.543317 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c4de23a-3df4-47a2-86f1-436a8b11c22d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.550731 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gxwtv\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-kube-api-access-gxwtv\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.563836 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") " pod="openstack/rabbitmq-server-0" Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.614198 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" event={"ID":"bd5b9d77-f719-44fa-ad65-3562931d6e37","Type":"ContainerStarted","Data":"e80d86ef5ba8c9dd5db695a069af40932350c70939bb91c36f4b0541d0558fe5"} Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.615456 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" event={"ID":"ba2bd56e-0ea5-428c-b646-3396dfdb35bf","Type":"ContainerStarted","Data":"9df2fb285341680c99799de94918f2b92fe3d7f2a842602231bed0b924eae639"} Sep 30 19:47:09 crc kubenswrapper[4553]: I0930 19:47:09.625874 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.256252 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.257963 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.259829 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-n5xdt" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.262802 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.263005 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.263377 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.263543 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.266664 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.278028 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.364746 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.364787 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " 
pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.364813 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.364831 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.364849 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-kolla-config\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.364870 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-secrets\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.364890 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkmxq\" (UniqueName: \"kubernetes.io/projected/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-kube-api-access-jkmxq\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 
19:47:11.364938 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-config-data-default\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.364954 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.465989 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.466688 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.466722 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.466744 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.466762 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-kolla-config\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.466784 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-secrets\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.466804 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkmxq\" (UniqueName: \"kubernetes.io/projected/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-kube-api-access-jkmxq\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.466888 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-config-data-default\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.466903 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-combined-ca-bundle\") pod \"openstack-galera-0\" 
(UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.467110 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.467516 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.468266 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.469446 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-kolla-config\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.469728 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-config-data-default\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.472101 
4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.485321 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkmxq\" (UniqueName: \"kubernetes.io/projected/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-kube-api-access-jkmxq\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.486820 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.500482 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.519193 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d0712b30-32a7-4e50-b263-c4b3d92b6f0e-secrets\") pod \"openstack-galera-0\" (UID: \"d0712b30-32a7-4e50-b263-c4b3d92b6f0e\") " pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.627487 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.668820 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.670431 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.672334 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mscf5" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.673381 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.674305 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.674500 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.697777 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.772133 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92d686d-50a7-44ab-80e0-5e5ee452045c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.772211 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " 
pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.772267 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a92d686d-50a7-44ab-80e0-5e5ee452045c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.772319 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a92d686d-50a7-44ab-80e0-5e5ee452045c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.772339 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwjsz\" (UniqueName: \"kubernetes.io/projected/a92d686d-50a7-44ab-80e0-5e5ee452045c-kube-api-access-hwjsz\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.772360 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a92d686d-50a7-44ab-80e0-5e5ee452045c-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.772386 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92d686d-50a7-44ab-80e0-5e5ee452045c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.772444 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a92d686d-50a7-44ab-80e0-5e5ee452045c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.772467 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a92d686d-50a7-44ab-80e0-5e5ee452045c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.873850 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a92d686d-50a7-44ab-80e0-5e5ee452045c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.874237 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a92d686d-50a7-44ab-80e0-5e5ee452045c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.874278 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92d686d-50a7-44ab-80e0-5e5ee452045c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " 
pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.874440 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.874485 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a92d686d-50a7-44ab-80e0-5e5ee452045c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.874516 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a92d686d-50a7-44ab-80e0-5e5ee452045c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.874534 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwjsz\" (UniqueName: \"kubernetes.io/projected/a92d686d-50a7-44ab-80e0-5e5ee452045c-kube-api-access-hwjsz\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.874550 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a92d686d-50a7-44ab-80e0-5e5ee452045c-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.874570 
4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92d686d-50a7-44ab-80e0-5e5ee452045c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.874586 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a92d686d-50a7-44ab-80e0-5e5ee452045c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.874655 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.875015 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a92d686d-50a7-44ab-80e0-5e5ee452045c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.876368 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a92d686d-50a7-44ab-80e0-5e5ee452045c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.876688 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a92d686d-50a7-44ab-80e0-5e5ee452045c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.878774 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92d686d-50a7-44ab-80e0-5e5ee452045c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.879965 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/a92d686d-50a7-44ab-80e0-5e5ee452045c-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.887519 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92d686d-50a7-44ab-80e0-5e5ee452045c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.893762 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwjsz\" (UniqueName: \"kubernetes.io/projected/a92d686d-50a7-44ab-80e0-5e5ee452045c-kube-api-access-hwjsz\") pod \"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.894500 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"a92d686d-50a7-44ab-80e0-5e5ee452045c\") " pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:11 crc kubenswrapper[4553]: I0930 19:47:11.994349 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.194159 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.195032 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.196898 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-jbfc6" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.197015 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.198494 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.216113 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.280438 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879abe38-75bc-4f92-9b0a-52524daadaee-combined-ca-bundle\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.280526 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/879abe38-75bc-4f92-9b0a-52524daadaee-config-data\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " 
pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.280554 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b82w6\" (UniqueName: \"kubernetes.io/projected/879abe38-75bc-4f92-9b0a-52524daadaee-kube-api-access-b82w6\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.280586 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/879abe38-75bc-4f92-9b0a-52524daadaee-memcached-tls-certs\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.280608 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/879abe38-75bc-4f92-9b0a-52524daadaee-kolla-config\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.381551 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b82w6\" (UniqueName: \"kubernetes.io/projected/879abe38-75bc-4f92-9b0a-52524daadaee-kube-api-access-b82w6\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.381626 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/879abe38-75bc-4f92-9b0a-52524daadaee-memcached-tls-certs\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.381653 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/879abe38-75bc-4f92-9b0a-52524daadaee-kolla-config\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.381690 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879abe38-75bc-4f92-9b0a-52524daadaee-combined-ca-bundle\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.381747 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/879abe38-75bc-4f92-9b0a-52524daadaee-config-data\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.382558 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/879abe38-75bc-4f92-9b0a-52524daadaee-config-data\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.382811 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/879abe38-75bc-4f92-9b0a-52524daadaee-kolla-config\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.385446 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/879abe38-75bc-4f92-9b0a-52524daadaee-memcached-tls-certs\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " 
pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.385913 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/879abe38-75bc-4f92-9b0a-52524daadaee-combined-ca-bundle\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.400446 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b82w6\" (UniqueName: \"kubernetes.io/projected/879abe38-75bc-4f92-9b0a-52524daadaee-kube-api-access-b82w6\") pod \"memcached-0\" (UID: \"879abe38-75bc-4f92-9b0a-52524daadaee\") " pod="openstack/memcached-0" Sep 30 19:47:12 crc kubenswrapper[4553]: I0930 19:47:12.510131 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 19:47:13 crc kubenswrapper[4553]: I0930 19:47:13.734341 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 19:47:13 crc kubenswrapper[4553]: I0930 19:47:13.735433 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 19:47:13 crc kubenswrapper[4553]: I0930 19:47:13.737070 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-nmrcz" Sep 30 19:47:13 crc kubenswrapper[4553]: I0930 19:47:13.749988 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 19:47:13 crc kubenswrapper[4553]: I0930 19:47:13.808141 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwst9\" (UniqueName: \"kubernetes.io/projected/c828a401-ebca-4e9d-850e-d6f74d380257-kube-api-access-vwst9\") pod \"kube-state-metrics-0\" (UID: \"c828a401-ebca-4e9d-850e-d6f74d380257\") " pod="openstack/kube-state-metrics-0" Sep 30 19:47:13 crc kubenswrapper[4553]: I0930 19:47:13.912224 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwst9\" (UniqueName: \"kubernetes.io/projected/c828a401-ebca-4e9d-850e-d6f74d380257-kube-api-access-vwst9\") pod \"kube-state-metrics-0\" (UID: \"c828a401-ebca-4e9d-850e-d6f74d380257\") " pod="openstack/kube-state-metrics-0" Sep 30 19:47:13 crc kubenswrapper[4553]: I0930 19:47:13.936933 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwst9\" (UniqueName: \"kubernetes.io/projected/c828a401-ebca-4e9d-850e-d6f74d380257-kube-api-access-vwst9\") pod \"kube-state-metrics-0\" (UID: \"c828a401-ebca-4e9d-850e-d6f74d380257\") " pod="openstack/kube-state-metrics-0" Sep 30 19:47:14 crc kubenswrapper[4553]: I0930 19:47:14.050815 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.268919 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r4k44"] Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.270387 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.272507 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.273030 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.274777 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nqpmj" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.276981 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-zwpmt"] Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.278857 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.297907 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r4k44"] Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.313007 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zwpmt"] Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.432633 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9e6cc85b-124a-415e-a4f1-17219da3165c-var-log-ovn\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.432706 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e6cc85b-124a-415e-a4f1-17219da3165c-var-run-ovn\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.432732 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9e6cc85b-124a-415e-a4f1-17219da3165c-var-run\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.432756 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e06ee589-214b-45a3-ab70-b71c4dfba2f9-var-log\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.432783 4553 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e06ee589-214b-45a3-ab70-b71c4dfba2f9-etc-ovs\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.432809 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e6cc85b-124a-415e-a4f1-17219da3165c-ovn-controller-tls-certs\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.432845 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbcpz\" (UniqueName: \"kubernetes.io/projected/9e6cc85b-124a-415e-a4f1-17219da3165c-kube-api-access-bbcpz\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.433069 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e06ee589-214b-45a3-ab70-b71c4dfba2f9-scripts\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.433127 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e06ee589-214b-45a3-ab70-b71c4dfba2f9-var-lib\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.433144 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e06ee589-214b-45a3-ab70-b71c4dfba2f9-var-run\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.433189 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e6cc85b-124a-415e-a4f1-17219da3165c-scripts\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.433274 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6cc85b-124a-415e-a4f1-17219da3165c-combined-ca-bundle\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.433385 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj5sx\" (UniqueName: \"kubernetes.io/projected/e06ee589-214b-45a3-ab70-b71c4dfba2f9-kube-api-access-bj5sx\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.535161 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e06ee589-214b-45a3-ab70-b71c4dfba2f9-var-log\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.535223 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e06ee589-214b-45a3-ab70-b71c4dfba2f9-etc-ovs\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.535256 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e6cc85b-124a-415e-a4f1-17219da3165c-ovn-controller-tls-certs\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.535296 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbcpz\" (UniqueName: \"kubernetes.io/projected/9e6cc85b-124a-415e-a4f1-17219da3165c-kube-api-access-bbcpz\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.535784 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e06ee589-214b-45a3-ab70-b71c4dfba2f9-var-log\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.535823 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e06ee589-214b-45a3-ab70-b71c4dfba2f9-etc-ovs\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.537350 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e06ee589-214b-45a3-ab70-b71c4dfba2f9-scripts\") pod \"ovn-controller-ovs-zwpmt\" (UID: 
\"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.537719 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e06ee589-214b-45a3-ab70-b71c4dfba2f9-var-lib\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.537751 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e06ee589-214b-45a3-ab70-b71c4dfba2f9-var-run\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.537811 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e6cc85b-124a-415e-a4f1-17219da3165c-scripts\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.537838 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6cc85b-124a-415e-a4f1-17219da3165c-combined-ca-bundle\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.537908 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj5sx\" (UniqueName: \"kubernetes.io/projected/e06ee589-214b-45a3-ab70-b71c4dfba2f9-kube-api-access-bj5sx\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 
19:47:18.537977 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9e6cc85b-124a-415e-a4f1-17219da3165c-var-log-ovn\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.538021 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e6cc85b-124a-415e-a4f1-17219da3165c-var-run-ovn\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.538076 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9e6cc85b-124a-415e-a4f1-17219da3165c-var-run\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.538464 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9e6cc85b-124a-415e-a4f1-17219da3165c-var-run\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.538753 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e06ee589-214b-45a3-ab70-b71c4dfba2f9-var-run\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.538873 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9e6cc85b-124a-415e-a4f1-17219da3165c-var-log-ovn\") 
pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.538919 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e06ee589-214b-45a3-ab70-b71c4dfba2f9-var-lib\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.538978 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e6cc85b-124a-415e-a4f1-17219da3165c-var-run-ovn\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.539542 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e06ee589-214b-45a3-ab70-b71c4dfba2f9-scripts\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.540824 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e6cc85b-124a-415e-a4f1-17219da3165c-scripts\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.545934 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6cc85b-124a-415e-a4f1-17219da3165c-combined-ca-bundle\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.552735 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e6cc85b-124a-415e-a4f1-17219da3165c-ovn-controller-tls-certs\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.557945 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj5sx\" (UniqueName: \"kubernetes.io/projected/e06ee589-214b-45a3-ab70-b71c4dfba2f9-kube-api-access-bj5sx\") pod \"ovn-controller-ovs-zwpmt\" (UID: \"e06ee589-214b-45a3-ab70-b71c4dfba2f9\") " pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.563596 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbcpz\" (UniqueName: \"kubernetes.io/projected/9e6cc85b-124a-415e-a4f1-17219da3165c-kube-api-access-bbcpz\") pod \"ovn-controller-r4k44\" (UID: \"9e6cc85b-124a-415e-a4f1-17219da3165c\") " pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.593610 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r4k44" Sep 30 19:47:18 crc kubenswrapper[4553]: I0930 19:47:18.606639 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.147924 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.149238 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.152327 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.152557 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.152638 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.152710 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.152714 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wmjr8" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.158017 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.251332 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-config\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.251376 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.251406 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.251484 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.251500 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw7cm\" (UniqueName: \"kubernetes.io/projected/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-kube-api-access-xw7cm\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.251515 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.251531 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.251554 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.352311 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.352366 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.352438 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-config\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.352455 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.352479 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 
19:47:19.352513 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.352528 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw7cm\" (UniqueName: \"kubernetes.io/projected/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-kube-api-access-xw7cm\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.352542 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.358113 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.358151 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.358173 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.358337 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.359340 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.368619 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.374739 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-config\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.384986 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: 
I0930 19:47:19.385426 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw7cm\" (UniqueName: \"kubernetes.io/projected/2ace1318-99e8-4ab2-9244-ed0ca49e89d5-kube-api-access-xw7cm\") pod \"ovsdbserver-nb-0\" (UID: \"2ace1318-99e8-4ab2-9244-ed0ca49e89d5\") " pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:19 crc kubenswrapper[4553]: I0930 19:47:19.480457 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.524058 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.526933 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.534819 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.535027 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-lg6cw" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.535191 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.535189 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.556210 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.672986 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee750363-8434-413d-9bc9-fee0218e2e1b-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.673071 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee750363-8434-413d-9bc9-fee0218e2e1b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.673148 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee750363-8434-413d-9bc9-fee0218e2e1b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.673209 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee750363-8434-413d-9bc9-fee0218e2e1b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.673230 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thqdm\" (UniqueName: \"kubernetes.io/projected/ee750363-8434-413d-9bc9-fee0218e2e1b-kube-api-access-thqdm\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.673261 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ee750363-8434-413d-9bc9-fee0218e2e1b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " 
pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.673317 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.673368 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee750363-8434-413d-9bc9-fee0218e2e1b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.774804 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee750363-8434-413d-9bc9-fee0218e2e1b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.774860 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee750363-8434-413d-9bc9-fee0218e2e1b-config\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.774887 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee750363-8434-413d-9bc9-fee0218e2e1b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.774917 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee750363-8434-413d-9bc9-fee0218e2e1b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.774956 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee750363-8434-413d-9bc9-fee0218e2e1b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.774981 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thqdm\" (UniqueName: \"kubernetes.io/projected/ee750363-8434-413d-9bc9-fee0218e2e1b-kube-api-access-thqdm\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.775008 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ee750363-8434-413d-9bc9-fee0218e2e1b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.775051 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.775315 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"ee750363-8434-413d-9bc9-fee0218e2e1b\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.776505 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee750363-8434-413d-9bc9-fee0218e2e1b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.778114 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee750363-8434-413d-9bc9-fee0218e2e1b-config\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.778382 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ee750363-8434-413d-9bc9-fee0218e2e1b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.783822 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee750363-8434-413d-9bc9-fee0218e2e1b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.788619 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee750363-8434-413d-9bc9-fee0218e2e1b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.790258 4553 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee750363-8434-413d-9bc9-fee0218e2e1b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.827608 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thqdm\" (UniqueName: \"kubernetes.io/projected/ee750363-8434-413d-9bc9-fee0218e2e1b-kube-api-access-thqdm\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.843591 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ee750363-8434-413d-9bc9-fee0218e2e1b\") " pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:20 crc kubenswrapper[4553]: I0930 19:47:20.855063 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:22 crc kubenswrapper[4553]: E0930 19:47:22.590880 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 19:47:22 crc kubenswrapper[4553]: E0930 19:47:22.591081 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mfrg7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFi
lesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-l96lx_openstack(4e77e2cf-0071-4293-ab5d-22c19d2a0f3f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:47:22 crc kubenswrapper[4553]: E0930 19:47:22.592256 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-l96lx" podUID="4e77e2cf-0071-4293-ab5d-22c19d2a0f3f" Sep 30 19:47:22 crc kubenswrapper[4553]: E0930 19:47:22.619793 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Sep 30 19:47:22 crc kubenswrapper[4553]: E0930 19:47:22.620015 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-krdtr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-cp6js_openstack(5ed1d18d-8390-45cd-baa9-94ba69b32def): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:47:22 crc kubenswrapper[4553]: E0930 19:47:22.621865 4553 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" podUID="5ed1d18d-8390-45cd-baa9-94ba69b32def" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.294176 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.342280 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.439653 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krdtr\" (UniqueName: \"kubernetes.io/projected/5ed1d18d-8390-45cd-baa9-94ba69b32def-kube-api-access-krdtr\") pod \"5ed1d18d-8390-45cd-baa9-94ba69b32def\" (UID: \"5ed1d18d-8390-45cd-baa9-94ba69b32def\") " Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.439787 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed1d18d-8390-45cd-baa9-94ba69b32def-config\") pod \"5ed1d18d-8390-45cd-baa9-94ba69b32def\" (UID: \"5ed1d18d-8390-45cd-baa9-94ba69b32def\") " Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.439842 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ed1d18d-8390-45cd-baa9-94ba69b32def-dns-svc\") pod \"5ed1d18d-8390-45cd-baa9-94ba69b32def\" (UID: \"5ed1d18d-8390-45cd-baa9-94ba69b32def\") " Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.440283 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ed1d18d-8390-45cd-baa9-94ba69b32def-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ed1d18d-8390-45cd-baa9-94ba69b32def" (UID: "5ed1d18d-8390-45cd-baa9-94ba69b32def"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.440300 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ed1d18d-8390-45cd-baa9-94ba69b32def-config" (OuterVolumeSpecName: "config") pod "5ed1d18d-8390-45cd-baa9-94ba69b32def" (UID: "5ed1d18d-8390-45cd-baa9-94ba69b32def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.440544 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ed1d18d-8390-45cd-baa9-94ba69b32def-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.440561 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ed1d18d-8390-45cd-baa9-94ba69b32def-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.445222 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed1d18d-8390-45cd-baa9-94ba69b32def-kube-api-access-krdtr" (OuterVolumeSpecName: "kube-api-access-krdtr") pod "5ed1d18d-8390-45cd-baa9-94ba69b32def" (UID: "5ed1d18d-8390-45cd-baa9-94ba69b32def"). InnerVolumeSpecName "kube-api-access-krdtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.464882 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 19:47:23 crc kubenswrapper[4553]: W0930 19:47:23.466848 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0712b30_32a7_4e50_b263_c4b3d92b6f0e.slice/crio-8f719004373db26aad94c68abe0a30045c77291b209fa677e31957b822810dc8 WatchSource:0}: Error finding container 8f719004373db26aad94c68abe0a30045c77291b209fa677e31957b822810dc8: Status 404 returned error can't find the container with id 8f719004373db26aad94c68abe0a30045c77291b209fa677e31957b822810dc8 Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.478622 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.479521 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-l96lx" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.542006 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krdtr\" (UniqueName: \"kubernetes.io/projected/5ed1d18d-8390-45cd-baa9-94ba69b32def-kube-api-access-krdtr\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.642899 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e77e2cf-0071-4293-ab5d-22c19d2a0f3f-config\") pod \"4e77e2cf-0071-4293-ab5d-22c19d2a0f3f\" (UID: \"4e77e2cf-0071-4293-ab5d-22c19d2a0f3f\") " Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.642948 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfrg7\" (UniqueName: \"kubernetes.io/projected/4e77e2cf-0071-4293-ab5d-22c19d2a0f3f-kube-api-access-mfrg7\") pod \"4e77e2cf-0071-4293-ab5d-22c19d2a0f3f\" (UID: \"4e77e2cf-0071-4293-ab5d-22c19d2a0f3f\") " Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.644150 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e77e2cf-0071-4293-ab5d-22c19d2a0f3f-config" (OuterVolumeSpecName: "config") pod "4e77e2cf-0071-4293-ab5d-22c19d2a0f3f" (UID: "4e77e2cf-0071-4293-ab5d-22c19d2a0f3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.652154 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.653380 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e77e2cf-0071-4293-ab5d-22c19d2a0f3f-kube-api-access-mfrg7" (OuterVolumeSpecName: "kube-api-access-mfrg7") pod "4e77e2cf-0071-4293-ab5d-22c19d2a0f3f" (UID: "4e77e2cf-0071-4293-ab5d-22c19d2a0f3f"). 
InnerVolumeSpecName "kube-api-access-mfrg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.664420 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.670001 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 19:47:23 crc kubenswrapper[4553]: W0930 19:47:23.671422 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c4de23a_3df4_47a2_86f1_436a8b11c22d.slice/crio-2417903089be013d11ef24537688194758861d1cdf3a423447473d98b078380a WatchSource:0}: Error finding container 2417903089be013d11ef24537688194758861d1cdf3a423447473d98b078380a: Status 404 returned error can't find the container with id 2417903089be013d11ef24537688194758861d1cdf3a423447473d98b078380a Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.690229 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r4k44"] Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.720382 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a92d686d-50a7-44ab-80e0-5e5ee452045c","Type":"ContainerStarted","Data":"6b9a29f390fea2c8ba9a876382058bb0373001f6d68784b34ca5266970a536f9"} Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.721454 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c828a401-ebca-4e9d-850e-d6f74d380257","Type":"ContainerStarted","Data":"7157764247c0e01afa00a9fa4c140f85e0657f7160647fe86a27263990541d94"} Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.722815 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"879abe38-75bc-4f92-9b0a-52524daadaee","Type":"ContainerStarted","Data":"92023635b6a123776be8e22df8580410ae4d013c627586a66dcff15231f6cf9f"} Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.723662 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r4k44" event={"ID":"9e6cc85b-124a-415e-a4f1-17219da3165c","Type":"ContainerStarted","Data":"844d3afe20176a5452e8b7e0aaa120f1118e60a00ba32a47897bbb08dc76d824"} Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.724440 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d0712b30-32a7-4e50-b263-c4b3d92b6f0e","Type":"ContainerStarted","Data":"8f719004373db26aad94c68abe0a30045c77291b209fa677e31957b822810dc8"} Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.725244 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-l96lx" event={"ID":"4e77e2cf-0071-4293-ab5d-22c19d2a0f3f","Type":"ContainerDied","Data":"b93f1a0b172cda7dd92745bc1b651060f941d7c13d3b66a065778578a4a686c6"} Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.725320 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-l96lx" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.727987 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7c4de23a-3df4-47a2-86f1-436a8b11c22d","Type":"ContainerStarted","Data":"2417903089be013d11ef24537688194758861d1cdf3a423447473d98b078380a"} Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.730376 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5bde6e85-a37e-4cec-a759-b0cd4eea2807","Type":"ContainerStarted","Data":"351e46539d24d282f706b3cdbf5f9b471b552a89e7740a87a6d0e65ccb82fd01"} Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.732566 4553 generic.go:334] "Generic (PLEG): container finished" podID="bd5b9d77-f719-44fa-ad65-3562931d6e37" containerID="6b8dc829325ac4457071b8f7c22666ffb59cdda091f91798f557c392ebf2a6b9" exitCode=0 Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.732619 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" event={"ID":"bd5b9d77-f719-44fa-ad65-3562931d6e37","Type":"ContainerDied","Data":"6b8dc829325ac4457071b8f7c22666ffb59cdda091f91798f557c392ebf2a6b9"} Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.739071 4553 generic.go:334] "Generic (PLEG): container finished" podID="ba2bd56e-0ea5-428c-b646-3396dfdb35bf" containerID="417d48c5ba710251809f88c78fceb9162e61fa9d3927bdf97872ba2f07fe7202" exitCode=0 Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.739126 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" event={"ID":"ba2bd56e-0ea5-428c-b646-3396dfdb35bf","Type":"ContainerDied","Data":"417d48c5ba710251809f88c78fceb9162e61fa9d3927bdf97872ba2f07fe7202"} Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.739922 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" 
event={"ID":"5ed1d18d-8390-45cd-baa9-94ba69b32def","Type":"ContainerDied","Data":"4267bd52384eea02ea2c028059c56380f9f7e5076244c5e1f4e0226eeec11948"} Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.739975 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-cp6js" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.748188 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e77e2cf-0071-4293-ab5d-22c19d2a0f3f-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.748228 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfrg7\" (UniqueName: \"kubernetes.io/projected/4e77e2cf-0071-4293-ab5d-22c19d2a0f3f-kube-api-access-mfrg7\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.799141 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l96lx"] Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.803402 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-l96lx"] Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.858217 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-cp6js"] Sep 30 19:47:23 crc kubenswrapper[4553]: I0930 19:47:23.878124 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-cp6js"] Sep 30 19:47:24 crc kubenswrapper[4553]: E0930 19:47:24.004542 4553 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Sep 30 19:47:24 crc kubenswrapper[4553]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/ba2bd56e-0ea5-428c-b646-3396dfdb35bf/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 30 19:47:24 crc kubenswrapper[4553]: > 
podSandboxID="9df2fb285341680c99799de94918f2b92fe3d7f2a842602231bed0b924eae639" Sep 30 19:47:24 crc kubenswrapper[4553]: E0930 19:47:24.004687 4553 kuberuntime_manager.go:1274] "Unhandled Error" err=< Sep 30 19:47:24 crc kubenswrapper[4553]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6jzrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-ltb2t_openstack(ba2bd56e-0ea5-428c-b646-3396dfdb35bf): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/ba2bd56e-0ea5-428c-b646-3396dfdb35bf/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 30 19:47:24 crc kubenswrapper[4553]: > logger="UnhandledError" Sep 30 19:47:24 crc kubenswrapper[4553]: E0930 19:47:24.005756 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/ba2bd56e-0ea5-428c-b646-3396dfdb35bf/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" podUID="ba2bd56e-0ea5-428c-b646-3396dfdb35bf" Sep 30 19:47:24 crc kubenswrapper[4553]: I0930 19:47:24.103362 4553 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 19:47:24 crc kubenswrapper[4553]: W0930 19:47:24.106691 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ace1318_99e8_4ab2_9244_ed0ca49e89d5.slice/crio-d41f42f361f1ddafb35e03c9b807979584ca7a76b5a9d9a99a5457a5ae298f24 WatchSource:0}: Error finding container d41f42f361f1ddafb35e03c9b807979584ca7a76b5a9d9a99a5457a5ae298f24: Status 404 returned error can't find the container with id d41f42f361f1ddafb35e03c9b807979584ca7a76b5a9d9a99a5457a5ae298f24 Sep 30 19:47:24 crc kubenswrapper[4553]: I0930 19:47:24.693371 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 19:47:24 crc kubenswrapper[4553]: I0930 19:47:24.748127 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2ace1318-99e8-4ab2-9244-ed0ca49e89d5","Type":"ContainerStarted","Data":"d41f42f361f1ddafb35e03c9b807979584ca7a76b5a9d9a99a5457a5ae298f24"} Sep 30 19:47:24 crc kubenswrapper[4553]: I0930 19:47:24.751077 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" event={"ID":"bd5b9d77-f719-44fa-ad65-3562931d6e37","Type":"ContainerStarted","Data":"b90be88b3a7ce03f5cca1a7803f423e779d3420f67e428ee7efa88365a4f83b7"} Sep 30 19:47:24 crc kubenswrapper[4553]: I0930 19:47:24.751102 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:47:24 crc kubenswrapper[4553]: I0930 19:47:24.790199 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" podStartSLOduration=3.026165446 podStartE2EDuration="16.790178598s" podCreationTimestamp="2025-09-30 19:47:08 +0000 UTC" firstStartedPulling="2025-09-30 19:47:09.017931699 +0000 UTC m=+882.217433829" lastFinishedPulling="2025-09-30 
19:47:22.781944851 +0000 UTC m=+895.981446981" observedRunningTime="2025-09-30 19:47:24.787474156 +0000 UTC m=+897.986976286" watchObservedRunningTime="2025-09-30 19:47:24.790178598 +0000 UTC m=+897.989680728" Sep 30 19:47:24 crc kubenswrapper[4553]: I0930 19:47:24.846299 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zwpmt"] Sep 30 19:47:25 crc kubenswrapper[4553]: I0930 19:47:25.515399 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e77e2cf-0071-4293-ab5d-22c19d2a0f3f" path="/var/lib/kubelet/pods/4e77e2cf-0071-4293-ab5d-22c19d2a0f3f/volumes" Sep 30 19:47:25 crc kubenswrapper[4553]: I0930 19:47:25.516164 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed1d18d-8390-45cd-baa9-94ba69b32def" path="/var/lib/kubelet/pods/5ed1d18d-8390-45cd-baa9-94ba69b32def/volumes" Sep 30 19:47:25 crc kubenswrapper[4553]: I0930 19:47:25.761769 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ee750363-8434-413d-9bc9-fee0218e2e1b","Type":"ContainerStarted","Data":"16b486b7afa43219720405a4b9eff978c3a965da57f08429fc120eb9fe066d05"} Sep 30 19:47:26 crc kubenswrapper[4553]: I0930 19:47:26.773724 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zwpmt" event={"ID":"e06ee589-214b-45a3-ab70-b71c4dfba2f9","Type":"ContainerStarted","Data":"a1530d02d0a7b12b5a1dc5496e3a6cc6c381afff28dc875f9feb6b6fa9469263"} Sep 30 19:47:29 crc kubenswrapper[4553]: I0930 19:47:29.585711 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:47:29 crc kubenswrapper[4553]: I0930 19:47:29.586393 4553 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:47:33 crc kubenswrapper[4553]: I0930 19:47:33.471425 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:47:33 crc kubenswrapper[4553]: I0930 19:47:33.550695 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ltb2t"] Sep 30 19:47:36 crc kubenswrapper[4553]: E0930 19:47:36.914393 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Sep 30 19:47:36 crc kubenswrapper[4553]: E0930 19:47:36.914908 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n57fh64h599h5bdhfch666hb8hbhf4h74h66fhc9hd6h54dh676h5bdh9bh5dchb9h596h74h4hf6h68hcbh689h5fbh675hbdh67bh55fh559q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bbcpz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},
InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-r4k44_openstack(9e6cc85b-124a-415e-a4f1-17219da3165c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:47:36 crc kubenswrapper[4553]: E0930 19:47:36.916175 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-r4k44" podUID="9e6cc85b-124a-415e-a4f1-17219da3165c" Sep 30 19:47:37 crc kubenswrapper[4553]: E0930 19:47:37.904564 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-r4k44" podUID="9e6cc85b-124a-415e-a4f1-17219da3165c" Sep 30 19:47:38 crc kubenswrapper[4553]: E0930 19:47:38.531342 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Sep 30 19:47:38 crc kubenswrapper[4553]: E0930 19:47:38.531440 4553 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Sep 30 19:47:38 crc kubenswrapper[4553]: E0930 19:47:38.531646 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vwst9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(c828a401-ebca-4e9d-850e-d6f74d380257): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 19:47:38 crc kubenswrapper[4553]: E0930 19:47:38.533246 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="c828a401-ebca-4e9d-850e-d6f74d380257" Sep 30 19:47:38 crc kubenswrapper[4553]: I0930 19:47:38.908579 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zwpmt" event={"ID":"e06ee589-214b-45a3-ab70-b71c4dfba2f9","Type":"ContainerStarted","Data":"58f2c04b30b0f5f88149da4b3c668146e43cff231330f02f09beaa2250362c4a"} Sep 30 19:47:38 crc kubenswrapper[4553]: E0930 19:47:38.910365 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="c828a401-ebca-4e9d-850e-d6f74d380257" Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.920808 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" event={"ID":"ba2bd56e-0ea5-428c-b646-3396dfdb35bf","Type":"ContainerStarted","Data":"8a980a5383641e705d326df773997e01d08bc1291bebca4ce66692db9535a73d"} Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.921138 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.920909 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" podUID="ba2bd56e-0ea5-428c-b646-3396dfdb35bf" containerName="dnsmasq-dns" containerID="cri-o://8a980a5383641e705d326df773997e01d08bc1291bebca4ce66692db9535a73d" gracePeriod=10 Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.924275 4553 generic.go:334] "Generic (PLEG): container finished" podID="e06ee589-214b-45a3-ab70-b71c4dfba2f9" containerID="58f2c04b30b0f5f88149da4b3c668146e43cff231330f02f09beaa2250362c4a" exitCode=0 Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.924417 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zwpmt" event={"ID":"e06ee589-214b-45a3-ab70-b71c4dfba2f9","Type":"ContainerDied","Data":"58f2c04b30b0f5f88149da4b3c668146e43cff231330f02f09beaa2250362c4a"} Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.933195 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d0712b30-32a7-4e50-b263-c4b3d92b6f0e","Type":"ContainerStarted","Data":"2156ee99bcc51357a22e63bee59973fa5421b4108bdf553fb7dc5f29e858dfdb"} Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.944581 4553 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a92d686d-50a7-44ab-80e0-5e5ee452045c","Type":"ContainerStarted","Data":"94345eb8b4ebb0c4c7bcba5291bf6e07679a0c120329eca49ed8395fd6d178f9"} Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.947409 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" podStartSLOduration=18.969387897 podStartE2EDuration="32.947385917s" podCreationTimestamp="2025-09-30 19:47:07 +0000 UTC" firstStartedPulling="2025-09-30 19:47:08.772780077 +0000 UTC m=+881.972282207" lastFinishedPulling="2025-09-30 19:47:22.750778097 +0000 UTC m=+895.950280227" observedRunningTime="2025-09-30 19:47:39.939241279 +0000 UTC m=+913.138743409" watchObservedRunningTime="2025-09-30 19:47:39.947385917 +0000 UTC m=+913.146888057" Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.962936 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2ace1318-99e8-4ab2-9244-ed0ca49e89d5","Type":"ContainerStarted","Data":"23af03a96a41f4948ab9ec30dec6ab5d5d683dcadae028ee6377adfd7ddbc0fb"} Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.964710 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5bde6e85-a37e-4cec-a759-b0cd4eea2807","Type":"ContainerStarted","Data":"56d474ee9d05db649aeef6acfb381ffed38ae8760337766c87b2088290b1b484"} Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.967358 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"879abe38-75bc-4f92-9b0a-52524daadaee","Type":"ContainerStarted","Data":"88c408e46de46a47cb6bbfe6b8ef854275e37f939d6de35f38b44f1d6d0bcfd4"} Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.967513 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.973486 4553 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7c4de23a-3df4-47a2-86f1-436a8b11c22d","Type":"ContainerStarted","Data":"d5f0840ea8aa7f8cfbf4b6f00581c0a035ac04301f1152a3204140e7bb6e4c85"} Sep 30 19:47:39 crc kubenswrapper[4553]: I0930 19:47:39.977016 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ee750363-8434-413d-9bc9-fee0218e2e1b","Type":"ContainerStarted","Data":"7872bccba6467f3b98fb0b1cd31b58ce6890659fa2ffa8e2118bc7fbb9f7cf66"} Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.110509 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.884926191 podStartE2EDuration="28.110489842s" podCreationTimestamp="2025-09-30 19:47:12 +0000 UTC" firstStartedPulling="2025-09-30 19:47:23.306594483 +0000 UTC m=+896.506096613" lastFinishedPulling="2025-09-30 19:47:36.532158104 +0000 UTC m=+909.731660264" observedRunningTime="2025-09-30 19:47:40.109741643 +0000 UTC m=+913.309243773" watchObservedRunningTime="2025-09-30 19:47:40.110489842 +0000 UTC m=+913.309991972" Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.463160 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.512562 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jzrq\" (UniqueName: \"kubernetes.io/projected/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-kube-api-access-6jzrq\") pod \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\" (UID: \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\") " Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.512718 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-config\") pod \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\" (UID: \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\") " Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.512740 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-dns-svc\") pod \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\" (UID: \"ba2bd56e-0ea5-428c-b646-3396dfdb35bf\") " Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.518755 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-kube-api-access-6jzrq" (OuterVolumeSpecName: "kube-api-access-6jzrq") pod "ba2bd56e-0ea5-428c-b646-3396dfdb35bf" (UID: "ba2bd56e-0ea5-428c-b646-3396dfdb35bf"). InnerVolumeSpecName "kube-api-access-6jzrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.562242 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba2bd56e-0ea5-428c-b646-3396dfdb35bf" (UID: "ba2bd56e-0ea5-428c-b646-3396dfdb35bf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.567540 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-config" (OuterVolumeSpecName: "config") pod "ba2bd56e-0ea5-428c-b646-3396dfdb35bf" (UID: "ba2bd56e-0ea5-428c-b646-3396dfdb35bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.614280 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.614379 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.614430 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jzrq\" (UniqueName: \"kubernetes.io/projected/ba2bd56e-0ea5-428c-b646-3396dfdb35bf-kube-api-access-6jzrq\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.987729 4553 generic.go:334] "Generic (PLEG): container finished" podID="ba2bd56e-0ea5-428c-b646-3396dfdb35bf" containerID="8a980a5383641e705d326df773997e01d08bc1291bebca4ce66692db9535a73d" exitCode=0 Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.987789 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.987806 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" event={"ID":"ba2bd56e-0ea5-428c-b646-3396dfdb35bf","Type":"ContainerDied","Data":"8a980a5383641e705d326df773997e01d08bc1291bebca4ce66692db9535a73d"} Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.990085 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-ltb2t" event={"ID":"ba2bd56e-0ea5-428c-b646-3396dfdb35bf","Type":"ContainerDied","Data":"9df2fb285341680c99799de94918f2b92fe3d7f2a842602231bed0b924eae639"} Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.990112 4553 scope.go:117] "RemoveContainer" containerID="8a980a5383641e705d326df773997e01d08bc1291bebca4ce66692db9535a73d" Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.999363 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zwpmt" event={"ID":"e06ee589-214b-45a3-ab70-b71c4dfba2f9","Type":"ContainerStarted","Data":"3ed5ca8341b97d7c9debec2937d8f2a25aab3b1f696b24ff02594685fdf3daf7"} Sep 30 19:47:40 crc kubenswrapper[4553]: I0930 19:47:40.999404 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zwpmt" event={"ID":"e06ee589-214b-45a3-ab70-b71c4dfba2f9","Type":"ContainerStarted","Data":"fab3d98b149209abe45f139677ac264d233d2cc8016c39a1bc9b37db64671e82"} Sep 30 19:47:41 crc kubenswrapper[4553]: I0930 19:47:41.027211 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-zwpmt" podStartSLOduration=11.92898008 podStartE2EDuration="23.027192096s" podCreationTimestamp="2025-09-30 19:47:18 +0000 UTC" firstStartedPulling="2025-09-30 19:47:26.400201658 +0000 UTC m=+899.599703788" lastFinishedPulling="2025-09-30 19:47:37.498413684 +0000 UTC m=+910.697915804" observedRunningTime="2025-09-30 
19:47:41.016418288 +0000 UTC m=+914.215920448" watchObservedRunningTime="2025-09-30 19:47:41.027192096 +0000 UTC m=+914.226694226" Sep 30 19:47:41 crc kubenswrapper[4553]: I0930 19:47:41.050734 4553 scope.go:117] "RemoveContainer" containerID="417d48c5ba710251809f88c78fceb9162e61fa9d3927bdf97872ba2f07fe7202" Sep 30 19:47:41 crc kubenswrapper[4553]: I0930 19:47:41.056277 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ltb2t"] Sep 30 19:47:41 crc kubenswrapper[4553]: I0930 19:47:41.064859 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-ltb2t"] Sep 30 19:47:41 crc kubenswrapper[4553]: I0930 19:47:41.093938 4553 scope.go:117] "RemoveContainer" containerID="8a980a5383641e705d326df773997e01d08bc1291bebca4ce66692db9535a73d" Sep 30 19:47:41 crc kubenswrapper[4553]: E0930 19:47:41.094369 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a980a5383641e705d326df773997e01d08bc1291bebca4ce66692db9535a73d\": container with ID starting with 8a980a5383641e705d326df773997e01d08bc1291bebca4ce66692db9535a73d not found: ID does not exist" containerID="8a980a5383641e705d326df773997e01d08bc1291bebca4ce66692db9535a73d" Sep 30 19:47:41 crc kubenswrapper[4553]: I0930 19:47:41.094402 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a980a5383641e705d326df773997e01d08bc1291bebca4ce66692db9535a73d"} err="failed to get container status \"8a980a5383641e705d326df773997e01d08bc1291bebca4ce66692db9535a73d\": rpc error: code = NotFound desc = could not find container \"8a980a5383641e705d326df773997e01d08bc1291bebca4ce66692db9535a73d\": container with ID starting with 8a980a5383641e705d326df773997e01d08bc1291bebca4ce66692db9535a73d not found: ID does not exist" Sep 30 19:47:41 crc kubenswrapper[4553]: I0930 19:47:41.094423 4553 scope.go:117] "RemoveContainer" 
containerID="417d48c5ba710251809f88c78fceb9162e61fa9d3927bdf97872ba2f07fe7202" Sep 30 19:47:41 crc kubenswrapper[4553]: E0930 19:47:41.094623 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"417d48c5ba710251809f88c78fceb9162e61fa9d3927bdf97872ba2f07fe7202\": container with ID starting with 417d48c5ba710251809f88c78fceb9162e61fa9d3927bdf97872ba2f07fe7202 not found: ID does not exist" containerID="417d48c5ba710251809f88c78fceb9162e61fa9d3927bdf97872ba2f07fe7202" Sep 30 19:47:41 crc kubenswrapper[4553]: I0930 19:47:41.094646 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"417d48c5ba710251809f88c78fceb9162e61fa9d3927bdf97872ba2f07fe7202"} err="failed to get container status \"417d48c5ba710251809f88c78fceb9162e61fa9d3927bdf97872ba2f07fe7202\": rpc error: code = NotFound desc = could not find container \"417d48c5ba710251809f88c78fceb9162e61fa9d3927bdf97872ba2f07fe7202\": container with ID starting with 417d48c5ba710251809f88c78fceb9162e61fa9d3927bdf97872ba2f07fe7202 not found: ID does not exist" Sep 30 19:47:41 crc kubenswrapper[4553]: I0930 19:47:41.512608 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba2bd56e-0ea5-428c-b646-3396dfdb35bf" path="/var/lib/kubelet/pods/ba2bd56e-0ea5-428c-b646-3396dfdb35bf/volumes" Sep 30 19:47:42 crc kubenswrapper[4553]: I0930 19:47:42.010238 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:42 crc kubenswrapper[4553]: I0930 19:47:42.010530 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:47:43 crc kubenswrapper[4553]: I0930 19:47:43.019305 4553 generic.go:334] "Generic (PLEG): container finished" podID="d0712b30-32a7-4e50-b263-c4b3d92b6f0e" containerID="2156ee99bcc51357a22e63bee59973fa5421b4108bdf553fb7dc5f29e858dfdb" exitCode=0 Sep 30 
19:47:43 crc kubenswrapper[4553]: I0930 19:47:43.020301 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d0712b30-32a7-4e50-b263-c4b3d92b6f0e","Type":"ContainerDied","Data":"2156ee99bcc51357a22e63bee59973fa5421b4108bdf553fb7dc5f29e858dfdb"} Sep 30 19:47:44 crc kubenswrapper[4553]: I0930 19:47:44.036941 4553 generic.go:334] "Generic (PLEG): container finished" podID="a92d686d-50a7-44ab-80e0-5e5ee452045c" containerID="94345eb8b4ebb0c4c7bcba5291bf6e07679a0c120329eca49ed8395fd6d178f9" exitCode=0 Sep 30 19:47:44 crc kubenswrapper[4553]: I0930 19:47:44.038103 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a92d686d-50a7-44ab-80e0-5e5ee452045c","Type":"ContainerDied","Data":"94345eb8b4ebb0c4c7bcba5291bf6e07679a0c120329eca49ed8395fd6d178f9"} Sep 30 19:47:45 crc kubenswrapper[4553]: I0930 19:47:45.050723 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a92d686d-50a7-44ab-80e0-5e5ee452045c","Type":"ContainerStarted","Data":"f4e593efe47b01bf2ef6c71ee5dbfe8ea8039aa1a33336c443a231b830f569a1"} Sep 30 19:47:45 crc kubenswrapper[4553]: I0930 19:47:45.054255 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"2ace1318-99e8-4ab2-9244-ed0ca49e89d5","Type":"ContainerStarted","Data":"74367b59d89b9e3bdfa256b5aa33ae9eec01ad7bf0329dac0a4f7f74e042e7ff"} Sep 30 19:47:45 crc kubenswrapper[4553]: I0930 19:47:45.059142 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ee750363-8434-413d-9bc9-fee0218e2e1b","Type":"ContainerStarted","Data":"e7893b531952a0e5284c0551a5d660ab8b14de46b5ea864b07a3f6a0def6e4f3"} Sep 30 19:47:45 crc kubenswrapper[4553]: I0930 19:47:45.063918 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"d0712b30-32a7-4e50-b263-c4b3d92b6f0e","Type":"ContainerStarted","Data":"f27fd4cd5f3cf25213b85eb9ed737bf6f7549a5c860e6e56e71395a0b36a45f9"} Sep 30 19:47:45 crc kubenswrapper[4553]: I0930 19:47:45.097920 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.471138183 podStartE2EDuration="35.097901552s" podCreationTimestamp="2025-09-30 19:47:10 +0000 UTC" firstStartedPulling="2025-09-30 19:47:23.469833961 +0000 UTC m=+896.669336091" lastFinishedPulling="2025-09-30 19:47:37.09659733 +0000 UTC m=+910.296099460" observedRunningTime="2025-09-30 19:47:45.09221874 +0000 UTC m=+918.291720880" watchObservedRunningTime="2025-09-30 19:47:45.097901552 +0000 UTC m=+918.297403692" Sep 30 19:47:45 crc kubenswrapper[4553]: I0930 19:47:45.136345 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.01140385 podStartE2EDuration="27.136319641s" podCreationTimestamp="2025-09-30 19:47:18 +0000 UTC" firstStartedPulling="2025-09-30 19:47:24.108515185 +0000 UTC m=+897.308017315" lastFinishedPulling="2025-09-30 19:47:44.233430956 +0000 UTC m=+917.432933106" observedRunningTime="2025-09-30 19:47:45.122421989 +0000 UTC m=+918.321924129" watchObservedRunningTime="2025-09-30 19:47:45.136319641 +0000 UTC m=+918.335821801" Sep 30 19:47:45 crc kubenswrapper[4553]: I0930 19:47:45.162726 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.904281983 podStartE2EDuration="35.162708987s" podCreationTimestamp="2025-09-30 19:47:10 +0000 UTC" firstStartedPulling="2025-09-30 19:47:23.469592595 +0000 UTC m=+896.669094725" lastFinishedPulling="2025-09-30 19:47:37.728019589 +0000 UTC m=+910.927521729" observedRunningTime="2025-09-30 19:47:45.158517855 +0000 UTC m=+918.358020045" watchObservedRunningTime="2025-09-30 19:47:45.162708987 +0000 UTC m=+918.362211127" Sep 
30 19:47:45 crc kubenswrapper[4553]: I0930 19:47:45.856180 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:46 crc kubenswrapper[4553]: I0930 19:47:46.482303 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:46 crc kubenswrapper[4553]: I0930 19:47:46.586017 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:46 crc kubenswrapper[4553]: I0930 19:47:46.627273 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.283093737 podStartE2EDuration="27.627240393s" podCreationTimestamp="2025-09-30 19:47:19 +0000 UTC" firstStartedPulling="2025-09-30 19:47:24.863158351 +0000 UTC m=+898.062660481" lastFinishedPulling="2025-09-30 19:47:44.207305017 +0000 UTC m=+917.406807137" observedRunningTime="2025-09-30 19:47:45.197112347 +0000 UTC m=+918.396614507" watchObservedRunningTime="2025-09-30 19:47:46.627240393 +0000 UTC m=+919.826742563" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.080241 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.117505 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.378840 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-s5f7q"] Sep 30 19:47:47 crc kubenswrapper[4553]: E0930 19:47:47.379207 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2bd56e-0ea5-428c-b646-3396dfdb35bf" containerName="dnsmasq-dns" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.379223 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2bd56e-0ea5-428c-b646-3396dfdb35bf" containerName="dnsmasq-dns" 
Sep 30 19:47:47 crc kubenswrapper[4553]: E0930 19:47:47.379237 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2bd56e-0ea5-428c-b646-3396dfdb35bf" containerName="init" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.379246 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2bd56e-0ea5-428c-b646-3396dfdb35bf" containerName="init" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.379389 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2bd56e-0ea5-428c-b646-3396dfdb35bf" containerName="dnsmasq-dns" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.380132 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.382537 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.393079 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-s5f7q"] Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.443141 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d564t\" (UniqueName: \"kubernetes.io/projected/48aca4bb-6062-421b-b8d1-995fa384a67c-kube-api-access-d564t\") pod \"dnsmasq-dns-7fd796d7df-s5f7q\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.443494 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-s5f7q\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.443539 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-s5f7q\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.443577 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-config\") pod \"dnsmasq-dns-7fd796d7df-s5f7q\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.467786 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-zj5kg"] Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.468797 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.477750 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-zj5kg"] Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.483273 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.512485 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.544584 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjz97\" (UniqueName: \"kubernetes.io/projected/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-kube-api-access-vjz97\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc 
kubenswrapper[4553]: I0930 19:47:47.544618 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.544652 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-ovn-rundir\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.544674 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-config\") pod \"dnsmasq-dns-7fd796d7df-s5f7q\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.544873 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-combined-ca-bundle\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.544928 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-config\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 
crc kubenswrapper[4553]: I0930 19:47:47.545070 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-ovs-rundir\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.545256 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d564t\" (UniqueName: \"kubernetes.io/projected/48aca4bb-6062-421b-b8d1-995fa384a67c-kube-api-access-d564t\") pod \"dnsmasq-dns-7fd796d7df-s5f7q\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.545342 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-s5f7q\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.545453 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-config\") pod \"dnsmasq-dns-7fd796d7df-s5f7q\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.545499 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-s5f7q\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.546255 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-s5f7q\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.546330 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-s5f7q\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.570778 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d564t\" (UniqueName: \"kubernetes.io/projected/48aca4bb-6062-421b-b8d1-995fa384a67c-kube-api-access-d564t\") pod \"dnsmasq-dns-7fd796d7df-s5f7q\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.646865 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjz97\" (UniqueName: \"kubernetes.io/projected/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-kube-api-access-vjz97\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.646903 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.646932 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-ovn-rundir\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.646968 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-combined-ca-bundle\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.646983 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-config\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.647020 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-ovs-rundir\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.647289 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-ovs-rundir\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.647839 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-config\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.647921 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-ovn-rundir\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.652273 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.659456 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-combined-ca-bundle\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.665729 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjz97\" (UniqueName: \"kubernetes.io/projected/e34b6b2d-694a-48f1-a7a1-3fd03f868af6-kube-api-access-vjz97\") pod \"ovn-controller-metrics-zj5kg\" (UID: \"e34b6b2d-694a-48f1-a7a1-3fd03f868af6\") " pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.694624 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.788080 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-zj5kg" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.797678 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-s5f7q"] Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.856009 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.856684 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pd5vh"] Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.857853 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.863314 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.884829 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pd5vh"] Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.949765 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tsdf\" (UniqueName: \"kubernetes.io/projected/e8dde328-186f-4af9-b082-157666c70811-kube-api-access-4tsdf\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.949835 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: 
\"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.949896 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.949936 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.949995 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-config\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:47 crc kubenswrapper[4553]: I0930 19:47:47.968852 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.036899 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-s5f7q"] Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.052312 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-config\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.052471 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tsdf\" (UniqueName: \"kubernetes.io/projected/e8dde328-186f-4af9-b082-157666c70811-kube-api-access-4tsdf\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.052518 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.052790 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.052834 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.058308 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:48 
crc kubenswrapper[4553]: I0930 19:47:48.058992 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-config\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.059197 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.059766 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.077364 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tsdf\" (UniqueName: \"kubernetes.io/projected/e8dde328-186f-4af9-b082-157666c70811-kube-api-access-4tsdf\") pod \"dnsmasq-dns-86db49b7ff-pd5vh\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.099504 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" event={"ID":"48aca4bb-6062-421b-b8d1-995fa384a67c","Type":"ContainerStarted","Data":"b9343a85de484da1623fbecf2de6c16e877afa5679913a77a5a9be10f70d7b29"} Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.154832 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 30 19:47:48 crc 
kubenswrapper[4553]: I0930 19:47:48.206505 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.347433 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-zj5kg"] Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.433507 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.434663 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.442617 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.442771 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.442898 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-27bb7" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.443007 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.454876 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.458627 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484ae2ff-7e70-4177-85b7-66369d8a7d76-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.458674 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/484ae2ff-7e70-4177-85b7-66369d8a7d76-scripts\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.458696 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/484ae2ff-7e70-4177-85b7-66369d8a7d76-config\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.458731 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/484ae2ff-7e70-4177-85b7-66369d8a7d76-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.458766 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-548ml\" (UniqueName: \"kubernetes.io/projected/484ae2ff-7e70-4177-85b7-66369d8a7d76-kube-api-access-548ml\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.458800 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/484ae2ff-7e70-4177-85b7-66369d8a7d76-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.458837 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/484ae2ff-7e70-4177-85b7-66369d8a7d76-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.541086 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pd5vh"] Sep 30 19:47:48 crc kubenswrapper[4553]: W0930 19:47:48.549701 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8dde328_186f_4af9_b082_157666c70811.slice/crio-e4d7a8b886dec9f43f32d15f8659f1cebaf3e15383a7ae72417bdb86141ead83 WatchSource:0}: Error finding container e4d7a8b886dec9f43f32d15f8659f1cebaf3e15383a7ae72417bdb86141ead83: Status 404 returned error can't find the container with id e4d7a8b886dec9f43f32d15f8659f1cebaf3e15383a7ae72417bdb86141ead83 Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.559875 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/484ae2ff-7e70-4177-85b7-66369d8a7d76-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.560345 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484ae2ff-7e70-4177-85b7-66369d8a7d76-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.560481 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/484ae2ff-7e70-4177-85b7-66369d8a7d76-scripts\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.560516 
4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/484ae2ff-7e70-4177-85b7-66369d8a7d76-config\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.560865 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/484ae2ff-7e70-4177-85b7-66369d8a7d76-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.560901 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-548ml\" (UniqueName: \"kubernetes.io/projected/484ae2ff-7e70-4177-85b7-66369d8a7d76-kube-api-access-548ml\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.560937 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/484ae2ff-7e70-4177-85b7-66369d8a7d76-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.561195 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/484ae2ff-7e70-4177-85b7-66369d8a7d76-scripts\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.561865 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/484ae2ff-7e70-4177-85b7-66369d8a7d76-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.562682 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/484ae2ff-7e70-4177-85b7-66369d8a7d76-config\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.566686 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/484ae2ff-7e70-4177-85b7-66369d8a7d76-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.568856 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/484ae2ff-7e70-4177-85b7-66369d8a7d76-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.572132 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/484ae2ff-7e70-4177-85b7-66369d8a7d76-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.583678 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-548ml\" (UniqueName: \"kubernetes.io/projected/484ae2ff-7e70-4177-85b7-66369d8a7d76-kube-api-access-548ml\") pod \"ovn-northd-0\" (UID: \"484ae2ff-7e70-4177-85b7-66369d8a7d76\") " pod="openstack/ovn-northd-0" Sep 30 19:47:48 crc kubenswrapper[4553]: I0930 19:47:48.775664 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.108711 4553 generic.go:334] "Generic (PLEG): container finished" podID="48aca4bb-6062-421b-b8d1-995fa384a67c" containerID="1681dda0342aba190b7923fe842098520b57007193a20f71d6756a19238e9108" exitCode=0 Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.109073 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" event={"ID":"48aca4bb-6062-421b-b8d1-995fa384a67c","Type":"ContainerDied","Data":"1681dda0342aba190b7923fe842098520b57007193a20f71d6756a19238e9108"} Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.110830 4553 generic.go:334] "Generic (PLEG): container finished" podID="e8dde328-186f-4af9-b082-157666c70811" containerID="740d7b7e14777f85679122be45a56894432a55728617e2da3c063787342c6ceb" exitCode=0 Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.111648 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" event={"ID":"e8dde328-186f-4af9-b082-157666c70811","Type":"ContainerDied","Data":"740d7b7e14777f85679122be45a56894432a55728617e2da3c063787342c6ceb"} Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.111689 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" event={"ID":"e8dde328-186f-4af9-b082-157666c70811","Type":"ContainerStarted","Data":"e4d7a8b886dec9f43f32d15f8659f1cebaf3e15383a7ae72417bdb86141ead83"} Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.126503 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-zj5kg" event={"ID":"e34b6b2d-694a-48f1-a7a1-3fd03f868af6","Type":"ContainerStarted","Data":"ee704610e72347e7d18fdb8e825406fc450c0aacf6386f63d4793b94ea2a4bc9"} Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.126536 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-zj5kg" 
event={"ID":"e34b6b2d-694a-48f1-a7a1-3fd03f868af6","Type":"ContainerStarted","Data":"acdd2d565156a3dcdac1db8ce58697d2921a0710cfcc4de60979e30b427f89d9"} Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.168355 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-zj5kg" podStartSLOduration=2.168337811 podStartE2EDuration="2.168337811s" podCreationTimestamp="2025-09-30 19:47:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:47:49.160534372 +0000 UTC m=+922.360036502" watchObservedRunningTime="2025-09-30 19:47:49.168337811 +0000 UTC m=+922.367839931" Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.355322 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 19:47:49 crc kubenswrapper[4553]: W0930 19:47:49.384417 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod484ae2ff_7e70_4177_85b7_66369d8a7d76.slice/crio-cce7e3d808806d1f593876ea2ae36db92ea3e1f6281fd2595640f6ce35a249ef WatchSource:0}: Error finding container cce7e3d808806d1f593876ea2ae36db92ea3e1f6281fd2595640f6ce35a249ef: Status 404 returned error can't find the container with id cce7e3d808806d1f593876ea2ae36db92ea3e1f6281fd2595640f6ce35a249ef Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.644954 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.782618 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-dns-svc\") pod \"48aca4bb-6062-421b-b8d1-995fa384a67c\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.782699 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-ovsdbserver-nb\") pod \"48aca4bb-6062-421b-b8d1-995fa384a67c\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.782746 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d564t\" (UniqueName: \"kubernetes.io/projected/48aca4bb-6062-421b-b8d1-995fa384a67c-kube-api-access-d564t\") pod \"48aca4bb-6062-421b-b8d1-995fa384a67c\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.782875 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-config\") pod \"48aca4bb-6062-421b-b8d1-995fa384a67c\" (UID: \"48aca4bb-6062-421b-b8d1-995fa384a67c\") " Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.790773 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48aca4bb-6062-421b-b8d1-995fa384a67c-kube-api-access-d564t" (OuterVolumeSpecName: "kube-api-access-d564t") pod "48aca4bb-6062-421b-b8d1-995fa384a67c" (UID: "48aca4bb-6062-421b-b8d1-995fa384a67c"). InnerVolumeSpecName "kube-api-access-d564t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.804198 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "48aca4bb-6062-421b-b8d1-995fa384a67c" (UID: "48aca4bb-6062-421b-b8d1-995fa384a67c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.806300 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-config" (OuterVolumeSpecName: "config") pod "48aca4bb-6062-421b-b8d1-995fa384a67c" (UID: "48aca4bb-6062-421b-b8d1-995fa384a67c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.815461 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "48aca4bb-6062-421b-b8d1-995fa384a67c" (UID: "48aca4bb-6062-421b-b8d1-995fa384a67c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.884952 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.884986 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.884997 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/48aca4bb-6062-421b-b8d1-995fa384a67c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:49 crc kubenswrapper[4553]: I0930 19:47:49.885006 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d564t\" (UniqueName: \"kubernetes.io/projected/48aca4bb-6062-421b-b8d1-995fa384a67c-kube-api-access-d564t\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:50 crc kubenswrapper[4553]: I0930 19:47:50.135321 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"484ae2ff-7e70-4177-85b7-66369d8a7d76","Type":"ContainerStarted","Data":"cce7e3d808806d1f593876ea2ae36db92ea3e1f6281fd2595640f6ce35a249ef"} Sep 30 19:47:50 crc kubenswrapper[4553]: I0930 19:47:50.137816 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" event={"ID":"e8dde328-186f-4af9-b082-157666c70811","Type":"ContainerStarted","Data":"5d5edef0ba162833320a7ad8a589a6067bade9ded426b0a398a4f1d3b350aadb"} Sep 30 19:47:50 crc kubenswrapper[4553]: I0930 19:47:50.139074 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:50 crc kubenswrapper[4553]: I0930 19:47:50.141211 4553 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" Sep 30 19:47:50 crc kubenswrapper[4553]: I0930 19:47:50.144212 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-s5f7q" event={"ID":"48aca4bb-6062-421b-b8d1-995fa384a67c","Type":"ContainerDied","Data":"b9343a85de484da1623fbecf2de6c16e877afa5679913a77a5a9be10f70d7b29"} Sep 30 19:47:50 crc kubenswrapper[4553]: I0930 19:47:50.144260 4553 scope.go:117] "RemoveContainer" containerID="1681dda0342aba190b7923fe842098520b57007193a20f71d6756a19238e9108" Sep 30 19:47:50 crc kubenswrapper[4553]: I0930 19:47:50.165197 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" podStartSLOduration=3.165177779 podStartE2EDuration="3.165177779s" podCreationTimestamp="2025-09-30 19:47:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:47:50.160107944 +0000 UTC m=+923.359610084" watchObservedRunningTime="2025-09-30 19:47:50.165177779 +0000 UTC m=+923.364679919" Sep 30 19:47:50 crc kubenswrapper[4553]: I0930 19:47:50.224597 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-s5f7q"] Sep 30 19:47:50 crc kubenswrapper[4553]: I0930 19:47:50.228694 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-s5f7q"] Sep 30 19:47:51 crc kubenswrapper[4553]: I0930 19:47:51.152251 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c828a401-ebca-4e9d-850e-d6f74d380257","Type":"ContainerStarted","Data":"558b77ad7d87b080239c53e2ff99ca3294d8daddd4d93e5fb4e9acb11f8eabff"} Sep 30 19:47:51 crc kubenswrapper[4553]: I0930 19:47:51.153702 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 19:47:51 crc 
kubenswrapper[4553]: I0930 19:47:51.158614 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"484ae2ff-7e70-4177-85b7-66369d8a7d76","Type":"ContainerStarted","Data":"574a571e2a22185ff541d8f3ffd6d0f7dcddc854ded74d18e39e2cf0c8088646"} Sep 30 19:47:51 crc kubenswrapper[4553]: I0930 19:47:51.158638 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"484ae2ff-7e70-4177-85b7-66369d8a7d76","Type":"ContainerStarted","Data":"7f11807e4618d67f062802db1a227d0f678b03503e215dcfea2d144b9bf66819"} Sep 30 19:47:51 crc kubenswrapper[4553]: I0930 19:47:51.158648 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 30 19:47:51 crc kubenswrapper[4553]: I0930 19:47:51.171098 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.951219344 podStartE2EDuration="38.171082921s" podCreationTimestamp="2025-09-30 19:47:13 +0000 UTC" firstStartedPulling="2025-09-30 19:47:23.675127726 +0000 UTC m=+896.874629846" lastFinishedPulling="2025-09-30 19:47:50.894991293 +0000 UTC m=+924.094493423" observedRunningTime="2025-09-30 19:47:51.165482462 +0000 UTC m=+924.364984592" watchObservedRunningTime="2025-09-30 19:47:51.171082921 +0000 UTC m=+924.370585051" Sep 30 19:47:51 crc kubenswrapper[4553]: I0930 19:47:51.188775 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.905386226 podStartE2EDuration="3.188761344s" podCreationTimestamp="2025-09-30 19:47:48 +0000 UTC" firstStartedPulling="2025-09-30 19:47:49.397476163 +0000 UTC m=+922.596978293" lastFinishedPulling="2025-09-30 19:47:50.680851281 +0000 UTC m=+923.880353411" observedRunningTime="2025-09-30 19:47:51.1863338 +0000 UTC m=+924.385835930" watchObservedRunningTime="2025-09-30 19:47:51.188761344 +0000 UTC m=+924.388263474" Sep 30 19:47:51 crc kubenswrapper[4553]: 
I0930 19:47:51.519680 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48aca4bb-6062-421b-b8d1-995fa384a67c" path="/var/lib/kubelet/pods/48aca4bb-6062-421b-b8d1-995fa384a67c/volumes" Sep 30 19:47:51 crc kubenswrapper[4553]: I0930 19:47:51.628260 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 30 19:47:51 crc kubenswrapper[4553]: I0930 19:47:51.628362 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 30 19:47:51 crc kubenswrapper[4553]: I0930 19:47:51.729327 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 30 19:47:51 crc kubenswrapper[4553]: I0930 19:47:51.995462 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:51 crc kubenswrapper[4553]: I0930 19:47:51.995541 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:52 crc kubenswrapper[4553]: I0930 19:47:52.167459 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r4k44" event={"ID":"9e6cc85b-124a-415e-a4f1-17219da3165c","Type":"ContainerStarted","Data":"5732b27ba1c810a15e12296124316bf29e932b6ff51a72c3c8cd669f59155358"} Sep 30 19:47:52 crc kubenswrapper[4553]: I0930 19:47:52.197630 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-r4k44" podStartSLOduration=6.907653863 podStartE2EDuration="34.197612105s" podCreationTimestamp="2025-09-30 19:47:18 +0000 UTC" firstStartedPulling="2025-09-30 19:47:23.70517922 +0000 UTC m=+896.904681350" lastFinishedPulling="2025-09-30 19:47:50.995137452 +0000 UTC m=+924.194639592" observedRunningTime="2025-09-30 19:47:52.192477737 +0000 UTC m=+925.391979887" watchObservedRunningTime="2025-09-30 19:47:52.197612105 +0000 UTC m=+925.397114245" Sep 
30 19:47:52 crc kubenswrapper[4553]: I0930 19:47:52.240256 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 30 19:47:52 crc kubenswrapper[4553]: I0930 19:47:52.525274 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-8vprf"] Sep 30 19:47:52 crc kubenswrapper[4553]: E0930 19:47:52.525613 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48aca4bb-6062-421b-b8d1-995fa384a67c" containerName="init" Sep 30 19:47:52 crc kubenswrapper[4553]: I0930 19:47:52.525625 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="48aca4bb-6062-421b-b8d1-995fa384a67c" containerName="init" Sep 30 19:47:52 crc kubenswrapper[4553]: I0930 19:47:52.525822 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="48aca4bb-6062-421b-b8d1-995fa384a67c" containerName="init" Sep 30 19:47:52 crc kubenswrapper[4553]: I0930 19:47:52.526404 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8vprf" Sep 30 19:47:52 crc kubenswrapper[4553]: I0930 19:47:52.536637 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8vprf"] Sep 30 19:47:52 crc kubenswrapper[4553]: I0930 19:47:52.641862 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57xk2\" (UniqueName: \"kubernetes.io/projected/f1c23598-c1a7-4544-9204-f071ac589644-kube-api-access-57xk2\") pod \"placement-db-create-8vprf\" (UID: \"f1c23598-c1a7-4544-9204-f071ac589644\") " pod="openstack/placement-db-create-8vprf" Sep 30 19:47:52 crc kubenswrapper[4553]: I0930 19:47:52.746505 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57xk2\" (UniqueName: \"kubernetes.io/projected/f1c23598-c1a7-4544-9204-f071ac589644-kube-api-access-57xk2\") pod \"placement-db-create-8vprf\" (UID: \"f1c23598-c1a7-4544-9204-f071ac589644\") " pod="openstack/placement-db-create-8vprf" Sep 30 19:47:52 crc kubenswrapper[4553]: I0930 19:47:52.773733 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57xk2\" (UniqueName: \"kubernetes.io/projected/f1c23598-c1a7-4544-9204-f071ac589644-kube-api-access-57xk2\") pod \"placement-db-create-8vprf\" (UID: \"f1c23598-c1a7-4544-9204-f071ac589644\") " pod="openstack/placement-db-create-8vprf" Sep 30 19:47:52 crc kubenswrapper[4553]: I0930 19:47:52.886538 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-8vprf" Sep 30 19:47:53 crc kubenswrapper[4553]: I0930 19:47:53.371260 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8vprf"] Sep 30 19:47:53 crc kubenswrapper[4553]: I0930 19:47:53.594566 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-r4k44" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.058995 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pd5vh"] Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.059517 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" podUID="e8dde328-186f-4af9-b082-157666c70811" containerName="dnsmasq-dns" containerID="cri-o://5d5edef0ba162833320a7ad8a589a6067bade9ded426b0a398a4f1d3b350aadb" gracePeriod=10 Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.067271 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.125480 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-6mpwk"] Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.127272 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.160279 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-6mpwk"] Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.198182 4553 generic.go:334] "Generic (PLEG): container finished" podID="f1c23598-c1a7-4544-9204-f071ac589644" containerID="de578ec35b37d0019c9064fa6449537af89e39d16c09cf7bcfe883868a2c834a" exitCode=0 Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.198419 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8vprf" event={"ID":"f1c23598-c1a7-4544-9204-f071ac589644","Type":"ContainerDied","Data":"de578ec35b37d0019c9064fa6449537af89e39d16c09cf7bcfe883868a2c834a"} Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.198493 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8vprf" event={"ID":"f1c23598-c1a7-4544-9204-f071ac589644","Type":"ContainerStarted","Data":"0f6bed20d18edf9aa43f481aafa6184d286e02b71f1e335bce5354670e2d98e3"} Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.275197 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.289248 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-dns-svc\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.289357 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-config\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: 
\"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.289387 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.289620 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ft6c\" (UniqueName: \"kubernetes.io/projected/7cde397a-0d8e-416d-8ac5-6051a5db9878-kube-api-access-6ft6c\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.289659 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.364837 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.391180 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-config\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.391234 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.391336 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ft6c\" (UniqueName: \"kubernetes.io/projected/7cde397a-0d8e-416d-8ac5-6051a5db9878-kube-api-access-6ft6c\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.391350 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.391404 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-dns-svc\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.392000 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-config\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.392442 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.392797 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.392932 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-dns-svc\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.426330 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ft6c\" (UniqueName: \"kubernetes.io/projected/7cde397a-0d8e-416d-8ac5-6051a5db9878-kube-api-access-6ft6c\") pod \"dnsmasq-dns-698758b865-6mpwk\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.444083 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.941400 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-6mpwk"] Sep 30 19:47:54 crc kubenswrapper[4553]: W0930 19:47:54.946253 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cde397a_0d8e_416d_8ac5_6051a5db9878.slice/crio-44c762c2a9f38e95872a1b169753aab21b3e0f3d447efafabf65db1bc7f45fb0 WatchSource:0}: Error finding container 44c762c2a9f38e95872a1b169753aab21b3e0f3d447efafabf65db1bc7f45fb0: Status 404 returned error can't find the container with id 44c762c2a9f38e95872a1b169753aab21b3e0f3d447efafabf65db1bc7f45fb0 Sep 30 19:47:54 crc kubenswrapper[4553]: I0930 19:47:54.963471 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.102581 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-ovsdbserver-nb\") pod \"e8dde328-186f-4af9-b082-157666c70811\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.102639 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-config\") pod \"e8dde328-186f-4af9-b082-157666c70811\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.102667 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-ovsdbserver-sb\") pod \"e8dde328-186f-4af9-b082-157666c70811\" (UID: 
\"e8dde328-186f-4af9-b082-157666c70811\") " Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.102868 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tsdf\" (UniqueName: \"kubernetes.io/projected/e8dde328-186f-4af9-b082-157666c70811-kube-api-access-4tsdf\") pod \"e8dde328-186f-4af9-b082-157666c70811\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.102897 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-dns-svc\") pod \"e8dde328-186f-4af9-b082-157666c70811\" (UID: \"e8dde328-186f-4af9-b082-157666c70811\") " Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.109546 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8dde328-186f-4af9-b082-157666c70811-kube-api-access-4tsdf" (OuterVolumeSpecName: "kube-api-access-4tsdf") pod "e8dde328-186f-4af9-b082-157666c70811" (UID: "e8dde328-186f-4af9-b082-157666c70811"). InnerVolumeSpecName "kube-api-access-4tsdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.172444 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8dde328-186f-4af9-b082-157666c70811" (UID: "e8dde328-186f-4af9-b082-157666c70811"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.188182 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e8dde328-186f-4af9-b082-157666c70811" (UID: "e8dde328-186f-4af9-b082-157666c70811"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.188656 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-config" (OuterVolumeSpecName: "config") pod "e8dde328-186f-4af9-b082-157666c70811" (UID: "e8dde328-186f-4af9-b082-157666c70811"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.195620 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e8dde328-186f-4af9-b082-157666c70811" (UID: "e8dde328-186f-4af9-b082-157666c70811"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.209538 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tsdf\" (UniqueName: \"kubernetes.io/projected/e8dde328-186f-4af9-b082-157666c70811-kube-api-access-4tsdf\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.209592 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.209607 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.209617 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-config\") on node \"crc\" 
DevicePath \"\"" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.209626 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8dde328-186f-4af9-b082-157666c70811-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.221880 4553 generic.go:334] "Generic (PLEG): container finished" podID="e8dde328-186f-4af9-b082-157666c70811" containerID="5d5edef0ba162833320a7ad8a589a6067bade9ded426b0a398a4f1d3b350aadb" exitCode=0 Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.221945 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" event={"ID":"e8dde328-186f-4af9-b082-157666c70811","Type":"ContainerDied","Data":"5d5edef0ba162833320a7ad8a589a6067bade9ded426b0a398a4f1d3b350aadb"} Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.221973 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" event={"ID":"e8dde328-186f-4af9-b082-157666c70811","Type":"ContainerDied","Data":"e4d7a8b886dec9f43f32d15f8659f1cebaf3e15383a7ae72417bdb86141ead83"} Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.221991 4553 scope.go:117] "RemoveContainer" containerID="5d5edef0ba162833320a7ad8a589a6067bade9ded426b0a398a4f1d3b350aadb" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.222115 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-pd5vh" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.233421 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-6mpwk" event={"ID":"7cde397a-0d8e-416d-8ac5-6051a5db9878","Type":"ContainerStarted","Data":"9d36972fe3d8d64d99acbe0d9410686f467cb4167cb00a1212af00914d6680b2"} Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.233475 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-6mpwk" event={"ID":"7cde397a-0d8e-416d-8ac5-6051a5db9878","Type":"ContainerStarted","Data":"44c762c2a9f38e95872a1b169753aab21b3e0f3d447efafabf65db1bc7f45fb0"} Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.266434 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Sep 30 19:47:55 crc kubenswrapper[4553]: E0930 19:47:55.266825 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8dde328-186f-4af9-b082-157666c70811" containerName="dnsmasq-dns" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.266841 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8dde328-186f-4af9-b082-157666c70811" containerName="dnsmasq-dns" Sep 30 19:47:55 crc kubenswrapper[4553]: E0930 19:47:55.266869 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8dde328-186f-4af9-b082-157666c70811" containerName="init" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.266876 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8dde328-186f-4af9-b082-157666c70811" containerName="init" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.267027 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8dde328-186f-4af9-b082-157666c70811" containerName="dnsmasq-dns" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.271296 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.275535 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.275633 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-jbnw5" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.275701 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.275815 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.285678 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.424153 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-cache\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.424627 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4t9f\" (UniqueName: \"kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-kube-api-access-d4t9f\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.424681 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " 
pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.424770 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-lock\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.424906 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.425528 4553 scope.go:117] "RemoveContainer" containerID="740d7b7e14777f85679122be45a56894432a55728617e2da3c063787342c6ceb" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.447102 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pd5vh"] Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.489977 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-pd5vh"] Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.491558 4553 scope.go:117] "RemoveContainer" containerID="5d5edef0ba162833320a7ad8a589a6067bade9ded426b0a398a4f1d3b350aadb" Sep 30 19:47:55 crc kubenswrapper[4553]: E0930 19:47:55.493424 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d5edef0ba162833320a7ad8a589a6067bade9ded426b0a398a4f1d3b350aadb\": container with ID starting with 5d5edef0ba162833320a7ad8a589a6067bade9ded426b0a398a4f1d3b350aadb not found: ID does not exist" containerID="5d5edef0ba162833320a7ad8a589a6067bade9ded426b0a398a4f1d3b350aadb" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.493594 4553 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5edef0ba162833320a7ad8a589a6067bade9ded426b0a398a4f1d3b350aadb"} err="failed to get container status \"5d5edef0ba162833320a7ad8a589a6067bade9ded426b0a398a4f1d3b350aadb\": rpc error: code = NotFound desc = could not find container \"5d5edef0ba162833320a7ad8a589a6067bade9ded426b0a398a4f1d3b350aadb\": container with ID starting with 5d5edef0ba162833320a7ad8a589a6067bade9ded426b0a398a4f1d3b350aadb not found: ID does not exist" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.493677 4553 scope.go:117] "RemoveContainer" containerID="740d7b7e14777f85679122be45a56894432a55728617e2da3c063787342c6ceb" Sep 30 19:47:55 crc kubenswrapper[4553]: E0930 19:47:55.497463 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"740d7b7e14777f85679122be45a56894432a55728617e2da3c063787342c6ceb\": container with ID starting with 740d7b7e14777f85679122be45a56894432a55728617e2da3c063787342c6ceb not found: ID does not exist" containerID="740d7b7e14777f85679122be45a56894432a55728617e2da3c063787342c6ceb" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.497583 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"740d7b7e14777f85679122be45a56894432a55728617e2da3c063787342c6ceb"} err="failed to get container status \"740d7b7e14777f85679122be45a56894432a55728617e2da3c063787342c6ceb\": rpc error: code = NotFound desc = could not find container \"740d7b7e14777f85679122be45a56894432a55728617e2da3c063787342c6ceb\": container with ID starting with 740d7b7e14777f85679122be45a56894432a55728617e2da3c063787342c6ceb not found: ID does not exist" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.523200 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8dde328-186f-4af9-b082-157666c70811" path="/var/lib/kubelet/pods/e8dde328-186f-4af9-b082-157666c70811/volumes" Sep 30 19:47:55 
crc kubenswrapper[4553]: I0930 19:47:55.530242 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-lock\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.530318 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.530389 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-cache\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.530415 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4t9f\" (UniqueName: \"kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-kube-api-access-d4t9f\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.530435 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.531603 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-lock\") pod \"swift-storage-0\" (UID: 
\"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.532133 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.533410 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-cache\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: E0930 19:47:55.533491 4553 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 19:47:55 crc kubenswrapper[4553]: E0930 19:47:55.533515 4553 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 19:47:55 crc kubenswrapper[4553]: E0930 19:47:55.533557 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift podName:0af05a35-cd0b-4875-b263-c8c62ebaa2cc nodeName:}" failed. No retries permitted until 2025-09-30 19:47:56.033541486 +0000 UTC m=+929.233043616 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift") pod "swift-storage-0" (UID: "0af05a35-cd0b-4875-b263-c8c62ebaa2cc") : configmap "swift-ring-files" not found Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.560249 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4t9f\" (UniqueName: \"kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-kube-api-access-d4t9f\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.574937 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.664741 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8vprf" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.737738 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57xk2\" (UniqueName: \"kubernetes.io/projected/f1c23598-c1a7-4544-9204-f071ac589644-kube-api-access-57xk2\") pod \"f1c23598-c1a7-4544-9204-f071ac589644\" (UID: \"f1c23598-c1a7-4544-9204-f071ac589644\") " Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.741078 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1c23598-c1a7-4544-9204-f071ac589644-kube-api-access-57xk2" (OuterVolumeSpecName: "kube-api-access-57xk2") pod "f1c23598-c1a7-4544-9204-f071ac589644" (UID: "f1c23598-c1a7-4544-9204-f071ac589644"). InnerVolumeSpecName "kube-api-access-57xk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:47:55 crc kubenswrapper[4553]: I0930 19:47:55.843694 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57xk2\" (UniqueName: \"kubernetes.io/projected/f1c23598-c1a7-4544-9204-f071ac589644-kube-api-access-57xk2\") on node \"crc\" DevicePath \"\"" Sep 30 19:47:56 crc kubenswrapper[4553]: I0930 19:47:56.045581 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:56 crc kubenswrapper[4553]: E0930 19:47:56.045976 4553 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 19:47:56 crc kubenswrapper[4553]: E0930 19:47:56.046001 4553 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 19:47:56 crc kubenswrapper[4553]: E0930 19:47:56.046096 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift podName:0af05a35-cd0b-4875-b263-c8c62ebaa2cc nodeName:}" failed. No retries permitted until 2025-09-30 19:47:57.046077843 +0000 UTC m=+930.245579973 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift") pod "swift-storage-0" (UID: "0af05a35-cd0b-4875-b263-c8c62ebaa2cc") : configmap "swift-ring-files" not found Sep 30 19:47:56 crc kubenswrapper[4553]: I0930 19:47:56.241666 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8vprf" event={"ID":"f1c23598-c1a7-4544-9204-f071ac589644","Type":"ContainerDied","Data":"0f6bed20d18edf9aa43f481aafa6184d286e02b71f1e335bce5354670e2d98e3"} Sep 30 19:47:56 crc kubenswrapper[4553]: I0930 19:47:56.241964 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f6bed20d18edf9aa43f481aafa6184d286e02b71f1e335bce5354670e2d98e3" Sep 30 19:47:56 crc kubenswrapper[4553]: I0930 19:47:56.241713 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8vprf" Sep 30 19:47:56 crc kubenswrapper[4553]: I0930 19:47:56.243719 4553 generic.go:334] "Generic (PLEG): container finished" podID="7cde397a-0d8e-416d-8ac5-6051a5db9878" containerID="9d36972fe3d8d64d99acbe0d9410686f467cb4167cb00a1212af00914d6680b2" exitCode=0 Sep 30 19:47:56 crc kubenswrapper[4553]: I0930 19:47:56.243853 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-6mpwk" event={"ID":"7cde397a-0d8e-416d-8ac5-6051a5db9878","Type":"ContainerDied","Data":"9d36972fe3d8d64d99acbe0d9410686f467cb4167cb00a1212af00914d6680b2"} Sep 30 19:47:57 crc kubenswrapper[4553]: I0930 19:47:57.059386 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:57 crc kubenswrapper[4553]: E0930 19:47:57.059638 4553 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 19:47:57 crc kubenswrapper[4553]: E0930 19:47:57.059674 4553 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 19:47:57 crc kubenswrapper[4553]: E0930 19:47:57.059795 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift podName:0af05a35-cd0b-4875-b263-c8c62ebaa2cc nodeName:}" failed. No retries permitted until 2025-09-30 19:47:59.059755013 +0000 UTC m=+932.259257153 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift") pod "swift-storage-0" (UID: "0af05a35-cd0b-4875-b263-c8c62ebaa2cc") : configmap "swift-ring-files" not found Sep 30 19:47:57 crc kubenswrapper[4553]: I0930 19:47:57.257141 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-6mpwk" event={"ID":"7cde397a-0d8e-416d-8ac5-6051a5db9878","Type":"ContainerStarted","Data":"42d2955b10ec7574c2f79a48f67fd3defe5c6fca5fab753f3f084a8aab1730c0"} Sep 30 19:47:57 crc kubenswrapper[4553]: I0930 19:47:57.258170 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:47:57 crc kubenswrapper[4553]: I0930 19:47:57.291084 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-6mpwk" podStartSLOduration=3.291020202 podStartE2EDuration="3.291020202s" podCreationTimestamp="2025-09-30 19:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:47:57.288450883 +0000 UTC m=+930.487953053" watchObservedRunningTime="2025-09-30 19:47:57.291020202 +0000 UTC m=+930.490522362" Sep 30 19:47:57 crc 
kubenswrapper[4553]: I0930 19:47:57.662799 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8qx8h"] Sep 30 19:47:57 crc kubenswrapper[4553]: E0930 19:47:57.663437 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1c23598-c1a7-4544-9204-f071ac589644" containerName="mariadb-database-create" Sep 30 19:47:57 crc kubenswrapper[4553]: I0930 19:47:57.663541 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c23598-c1a7-4544-9204-f071ac589644" containerName="mariadb-database-create" Sep 30 19:47:57 crc kubenswrapper[4553]: I0930 19:47:57.663808 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1c23598-c1a7-4544-9204-f071ac589644" containerName="mariadb-database-create" Sep 30 19:47:57 crc kubenswrapper[4553]: I0930 19:47:57.664485 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8qx8h" Sep 30 19:47:57 crc kubenswrapper[4553]: I0930 19:47:57.673603 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8qx8h"] Sep 30 19:47:57 crc kubenswrapper[4553]: I0930 19:47:57.769346 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzxhw\" (UniqueName: \"kubernetes.io/projected/7810c768-948a-47c9-99e0-4b9c5c38f7ba-kube-api-access-tzxhw\") pod \"glance-db-create-8qx8h\" (UID: \"7810c768-948a-47c9-99e0-4b9c5c38f7ba\") " pod="openstack/glance-db-create-8qx8h" Sep 30 19:47:57 crc kubenswrapper[4553]: I0930 19:47:57.870776 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzxhw\" (UniqueName: \"kubernetes.io/projected/7810c768-948a-47c9-99e0-4b9c5c38f7ba-kube-api-access-tzxhw\") pod \"glance-db-create-8qx8h\" (UID: \"7810c768-948a-47c9-99e0-4b9c5c38f7ba\") " pod="openstack/glance-db-create-8qx8h" Sep 30 19:47:57 crc kubenswrapper[4553]: I0930 19:47:57.890711 4553 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tzxhw\" (UniqueName: \"kubernetes.io/projected/7810c768-948a-47c9-99e0-4b9c5c38f7ba-kube-api-access-tzxhw\") pod \"glance-db-create-8qx8h\" (UID: \"7810c768-948a-47c9-99e0-4b9c5c38f7ba\") " pod="openstack/glance-db-create-8qx8h" Sep 30 19:47:57 crc kubenswrapper[4553]: I0930 19:47:57.981005 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8qx8h" Sep 30 19:47:58 crc kubenswrapper[4553]: I0930 19:47:58.501200 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8qx8h"] Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.105278 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:47:59 crc kubenswrapper[4553]: E0930 19:47:59.105499 4553 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 19:47:59 crc kubenswrapper[4553]: E0930 19:47:59.106072 4553 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 19:47:59 crc kubenswrapper[4553]: E0930 19:47:59.106135 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift podName:0af05a35-cd0b-4875-b263-c8c62ebaa2cc nodeName:}" failed. No retries permitted until 2025-09-30 19:48:03.106118489 +0000 UTC m=+936.305620619 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift") pod "swift-storage-0" (UID: "0af05a35-cd0b-4875-b263-c8c62ebaa2cc") : configmap "swift-ring-files" not found Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.236396 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-bfvdv"] Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.237418 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.239240 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.241896 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.243092 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.262900 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bfvdv"] Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.275547 4553 generic.go:334] "Generic (PLEG): container finished" podID="7810c768-948a-47c9-99e0-4b9c5c38f7ba" containerID="d84c03382c7aa10aa510431435cb9fe64a1ccaee7681445665778d72fab29d9e" exitCode=0 Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.275640 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8qx8h" event={"ID":"7810c768-948a-47c9-99e0-4b9c5c38f7ba","Type":"ContainerDied","Data":"d84c03382c7aa10aa510431435cb9fe64a1ccaee7681445665778d72fab29d9e"} Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.276592 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8qx8h" 
event={"ID":"7810c768-948a-47c9-99e0-4b9c5c38f7ba","Type":"ContainerStarted","Data":"213ae37692538ff9f51bb1134e427a1a5bcf95151190d6db27cbab609092bda8"} Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.308890 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-combined-ca-bundle\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.309076 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-swiftconf\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.309127 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1af5c0f2-d0c8-4b67-889b-b677e346c46c-scripts\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.309151 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1af5c0f2-d0c8-4b67-889b-b677e346c46c-ring-data-devices\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.309336 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm6vp\" (UniqueName: 
\"kubernetes.io/projected/1af5c0f2-d0c8-4b67-889b-b677e346c46c-kube-api-access-sm6vp\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.309387 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-dispersionconf\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.309430 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1af5c0f2-d0c8-4b67-889b-b677e346c46c-etc-swift\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.410400 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-swiftconf\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.410447 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1af5c0f2-d0c8-4b67-889b-b677e346c46c-scripts\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.410469 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/1af5c0f2-d0c8-4b67-889b-b677e346c46c-ring-data-devices\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.410525 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm6vp\" (UniqueName: \"kubernetes.io/projected/1af5c0f2-d0c8-4b67-889b-b677e346c46c-kube-api-access-sm6vp\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.410549 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-dispersionconf\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.410569 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1af5c0f2-d0c8-4b67-889b-b677e346c46c-etc-swift\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.410624 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-combined-ca-bundle\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.411423 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1af5c0f2-d0c8-4b67-889b-b677e346c46c-scripts\") pod 
\"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.411674 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1af5c0f2-d0c8-4b67-889b-b677e346c46c-etc-swift\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.411846 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1af5c0f2-d0c8-4b67-889b-b677e346c46c-ring-data-devices\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.418619 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-dispersionconf\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.418891 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-combined-ca-bundle\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.423835 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-swiftconf\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 
30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.426194 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm6vp\" (UniqueName: \"kubernetes.io/projected/1af5c0f2-d0c8-4b67-889b-b677e346c46c-kube-api-access-sm6vp\") pod \"swift-ring-rebalance-bfvdv\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.584690 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.584753 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:47:59 crc kubenswrapper[4553]: I0930 19:47:59.608885 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:48:00 crc kubenswrapper[4553]: I0930 19:48:00.052476 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bfvdv"] Sep 30 19:48:00 crc kubenswrapper[4553]: I0930 19:48:00.287114 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bfvdv" event={"ID":"1af5c0f2-d0c8-4b67-889b-b677e346c46c","Type":"ContainerStarted","Data":"60be83aa57e164fa9d7743739c922e31fad862187f13f283c59104b69a517db2"} Sep 30 19:48:00 crc kubenswrapper[4553]: I0930 19:48:00.656582 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8qx8h" Sep 30 19:48:00 crc kubenswrapper[4553]: I0930 19:48:00.734395 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzxhw\" (UniqueName: \"kubernetes.io/projected/7810c768-948a-47c9-99e0-4b9c5c38f7ba-kube-api-access-tzxhw\") pod \"7810c768-948a-47c9-99e0-4b9c5c38f7ba\" (UID: \"7810c768-948a-47c9-99e0-4b9c5c38f7ba\") " Sep 30 19:48:00 crc kubenswrapper[4553]: I0930 19:48:00.741617 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7810c768-948a-47c9-99e0-4b9c5c38f7ba-kube-api-access-tzxhw" (OuterVolumeSpecName: "kube-api-access-tzxhw") pod "7810c768-948a-47c9-99e0-4b9c5c38f7ba" (UID: "7810c768-948a-47c9-99e0-4b9c5c38f7ba"). InnerVolumeSpecName "kube-api-access-tzxhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:00 crc kubenswrapper[4553]: I0930 19:48:00.836017 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzxhw\" (UniqueName: \"kubernetes.io/projected/7810c768-948a-47c9-99e0-4b9c5c38f7ba-kube-api-access-tzxhw\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:01 crc kubenswrapper[4553]: I0930 19:48:01.300789 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8qx8h" event={"ID":"7810c768-948a-47c9-99e0-4b9c5c38f7ba","Type":"ContainerDied","Data":"213ae37692538ff9f51bb1134e427a1a5bcf95151190d6db27cbab609092bda8"} Sep 30 19:48:01 crc kubenswrapper[4553]: I0930 19:48:01.301054 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="213ae37692538ff9f51bb1134e427a1a5bcf95151190d6db27cbab609092bda8" Sep 30 19:48:01 crc kubenswrapper[4553]: I0930 19:48:01.300819 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8qx8h" Sep 30 19:48:01 crc kubenswrapper[4553]: E0930 19:48:01.364976 4553 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7810c768_948a_47c9_99e0_4b9c5c38f7ba.slice\": RecentStats: unable to find data in memory cache]" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.121240 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-vhp2l"] Sep 30 19:48:02 crc kubenswrapper[4553]: E0930 19:48:02.121964 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7810c768-948a-47c9-99e0-4b9c5c38f7ba" containerName="mariadb-database-create" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.121979 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="7810c768-948a-47c9-99e0-4b9c5c38f7ba" containerName="mariadb-database-create" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.122225 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="7810c768-948a-47c9-99e0-4b9c5c38f7ba" containerName="mariadb-database-create" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.122888 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vhp2l" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.136432 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vhp2l"] Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.164392 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px9rz\" (UniqueName: \"kubernetes.io/projected/cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8-kube-api-access-px9rz\") pod \"keystone-db-create-vhp2l\" (UID: \"cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8\") " pod="openstack/keystone-db-create-vhp2l" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.267316 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px9rz\" (UniqueName: \"kubernetes.io/projected/cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8-kube-api-access-px9rz\") pod \"keystone-db-create-vhp2l\" (UID: \"cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8\") " pod="openstack/keystone-db-create-vhp2l" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.298442 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px9rz\" (UniqueName: \"kubernetes.io/projected/cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8-kube-api-access-px9rz\") pod \"keystone-db-create-vhp2l\" (UID: \"cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8\") " pod="openstack/keystone-db-create-vhp2l" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.439867 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vhp2l" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.560801 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-a1f4-account-create-vqppj"] Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.561779 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a1f4-account-create-vqppj" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.564791 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.576666 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a1f4-account-create-vqppj"] Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.673099 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8phc\" (UniqueName: \"kubernetes.io/projected/5c5e64e4-7905-4524-bd33-8ab355eb2c90-kube-api-access-w8phc\") pod \"placement-a1f4-account-create-vqppj\" (UID: \"5c5e64e4-7905-4524-bd33-8ab355eb2c90\") " pod="openstack/placement-a1f4-account-create-vqppj" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.774393 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8phc\" (UniqueName: \"kubernetes.io/projected/5c5e64e4-7905-4524-bd33-8ab355eb2c90-kube-api-access-w8phc\") pod \"placement-a1f4-account-create-vqppj\" (UID: \"5c5e64e4-7905-4524-bd33-8ab355eb2c90\") " pod="openstack/placement-a1f4-account-create-vqppj" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.794506 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8phc\" (UniqueName: \"kubernetes.io/projected/5c5e64e4-7905-4524-bd33-8ab355eb2c90-kube-api-access-w8phc\") pod \"placement-a1f4-account-create-vqppj\" (UID: \"5c5e64e4-7905-4524-bd33-8ab355eb2c90\") " pod="openstack/placement-a1f4-account-create-vqppj" Sep 30 19:48:02 crc kubenswrapper[4553]: I0930 19:48:02.929999 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a1f4-account-create-vqppj" Sep 30 19:48:03 crc kubenswrapper[4553]: I0930 19:48:03.181137 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:48:03 crc kubenswrapper[4553]: E0930 19:48:03.181390 4553 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 19:48:03 crc kubenswrapper[4553]: E0930 19:48:03.181414 4553 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 19:48:03 crc kubenswrapper[4553]: E0930 19:48:03.181479 4553 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift podName:0af05a35-cd0b-4875-b263-c8c62ebaa2cc nodeName:}" failed. No retries permitted until 2025-09-30 19:48:11.181461279 +0000 UTC m=+944.380963409 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift") pod "swift-storage-0" (UID: "0af05a35-cd0b-4875-b263-c8c62ebaa2cc") : configmap "swift-ring-files" not found Sep 30 19:48:03 crc kubenswrapper[4553]: I0930 19:48:03.871341 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 19:48:04 crc kubenswrapper[4553]: I0930 19:48:04.078530 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 19:48:04 crc kubenswrapper[4553]: I0930 19:48:04.324922 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bfvdv" event={"ID":"1af5c0f2-d0c8-4b67-889b-b677e346c46c","Type":"ContainerStarted","Data":"52ac3d0d66cbb31fe94ae20105050847c3d448aff1acc03b0f74df228aea9083"} Sep 30 19:48:04 crc kubenswrapper[4553]: I0930 19:48:04.348464 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-bfvdv" podStartSLOduration=1.495904464 podStartE2EDuration="5.348447271s" podCreationTimestamp="2025-09-30 19:47:59 +0000 UTC" firstStartedPulling="2025-09-30 19:48:00.066355837 +0000 UTC m=+933.265857987" lastFinishedPulling="2025-09-30 19:48:03.918898664 +0000 UTC m=+937.118400794" observedRunningTime="2025-09-30 19:48:04.345230645 +0000 UTC m=+937.544732775" watchObservedRunningTime="2025-09-30 19:48:04.348447271 +0000 UTC m=+937.547949401" Sep 30 19:48:04 crc kubenswrapper[4553]: I0930 19:48:04.447217 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:48:04 crc kubenswrapper[4553]: I0930 19:48:04.461437 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vhp2l"] Sep 30 19:48:04 crc kubenswrapper[4553]: W0930 19:48:04.463770 4553 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdcfeb4c_8e6b_4854_8ec4_a82942ca83a8.slice/crio-3b0862ba9e31697549599c82647ce1a87cdb81b07a8c536bc9f00feb391caa0d WatchSource:0}: Error finding container 3b0862ba9e31697549599c82647ce1a87cdb81b07a8c536bc9f00feb391caa0d: Status 404 returned error can't find the container with id 3b0862ba9e31697549599c82647ce1a87cdb81b07a8c536bc9f00feb391caa0d Sep 30 19:48:04 crc kubenswrapper[4553]: I0930 19:48:04.582267 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7m29p"] Sep 30 19:48:04 crc kubenswrapper[4553]: I0930 19:48:04.582689 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" podUID="bd5b9d77-f719-44fa-ad65-3562931d6e37" containerName="dnsmasq-dns" containerID="cri-o://b90be88b3a7ce03f5cca1a7803f423e779d3420f67e428ee7efa88365a4f83b7" gracePeriod=10 Sep 30 19:48:04 crc kubenswrapper[4553]: I0930 19:48:04.664831 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a1f4-account-create-vqppj"] Sep 30 19:48:04 crc kubenswrapper[4553]: W0930 19:48:04.683428 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c5e64e4_7905_4524_bd33_8ab355eb2c90.slice/crio-f5b6e909f337c13620fce806a579eb9d2afdb63dcab16a6eccded36fcfcb06ba WatchSource:0}: Error finding container f5b6e909f337c13620fce806a579eb9d2afdb63dcab16a6eccded36fcfcb06ba: Status 404 returned error can't find the container with id f5b6e909f337c13620fce806a579eb9d2afdb63dcab16a6eccded36fcfcb06ba Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.189585 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.220773 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4np94\" (UniqueName: \"kubernetes.io/projected/bd5b9d77-f719-44fa-ad65-3562931d6e37-kube-api-access-4np94\") pod \"bd5b9d77-f719-44fa-ad65-3562931d6e37\" (UID: \"bd5b9d77-f719-44fa-ad65-3562931d6e37\") " Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.221293 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd5b9d77-f719-44fa-ad65-3562931d6e37-config\") pod \"bd5b9d77-f719-44fa-ad65-3562931d6e37\" (UID: \"bd5b9d77-f719-44fa-ad65-3562931d6e37\") " Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.221384 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd5b9d77-f719-44fa-ad65-3562931d6e37-dns-svc\") pod \"bd5b9d77-f719-44fa-ad65-3562931d6e37\" (UID: \"bd5b9d77-f719-44fa-ad65-3562931d6e37\") " Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.252056 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5b9d77-f719-44fa-ad65-3562931d6e37-kube-api-access-4np94" (OuterVolumeSpecName: "kube-api-access-4np94") pod "bd5b9d77-f719-44fa-ad65-3562931d6e37" (UID: "bd5b9d77-f719-44fa-ad65-3562931d6e37"). InnerVolumeSpecName "kube-api-access-4np94". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.275631 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd5b9d77-f719-44fa-ad65-3562931d6e37-config" (OuterVolumeSpecName: "config") pod "bd5b9d77-f719-44fa-ad65-3562931d6e37" (UID: "bd5b9d77-f719-44fa-ad65-3562931d6e37"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.302555 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd5b9d77-f719-44fa-ad65-3562931d6e37-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd5b9d77-f719-44fa-ad65-3562931d6e37" (UID: "bd5b9d77-f719-44fa-ad65-3562931d6e37"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.323432 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd5b9d77-f719-44fa-ad65-3562931d6e37-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.323467 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd5b9d77-f719-44fa-ad65-3562931d6e37-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.323478 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4np94\" (UniqueName: \"kubernetes.io/projected/bd5b9d77-f719-44fa-ad65-3562931d6e37-kube-api-access-4np94\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.333241 4553 generic.go:334] "Generic (PLEG): container finished" podID="bd5b9d77-f719-44fa-ad65-3562931d6e37" containerID="b90be88b3a7ce03f5cca1a7803f423e779d3420f67e428ee7efa88365a4f83b7" exitCode=0 Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.333310 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" event={"ID":"bd5b9d77-f719-44fa-ad65-3562931d6e37","Type":"ContainerDied","Data":"b90be88b3a7ce03f5cca1a7803f423e779d3420f67e428ee7efa88365a4f83b7"} Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.333337 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" 
event={"ID":"bd5b9d77-f719-44fa-ad65-3562931d6e37","Type":"ContainerDied","Data":"e80d86ef5ba8c9dd5db695a069af40932350c70939bb91c36f4b0541d0558fe5"} Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.333352 4553 scope.go:117] "RemoveContainer" containerID="b90be88b3a7ce03f5cca1a7803f423e779d3420f67e428ee7efa88365a4f83b7" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.335050 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-7m29p" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.335498 4553 generic.go:334] "Generic (PLEG): container finished" podID="cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8" containerID="a2314c63c3c97b40030a5904d6d4593b908a8a65f7aeda88b89ffa11914c35b3" exitCode=0 Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.335534 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vhp2l" event={"ID":"cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8","Type":"ContainerDied","Data":"a2314c63c3c97b40030a5904d6d4593b908a8a65f7aeda88b89ffa11914c35b3"} Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.335549 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vhp2l" event={"ID":"cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8","Type":"ContainerStarted","Data":"3b0862ba9e31697549599c82647ce1a87cdb81b07a8c536bc9f00feb391caa0d"} Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.336826 4553 generic.go:334] "Generic (PLEG): container finished" podID="5c5e64e4-7905-4524-bd33-8ab355eb2c90" containerID="20de24747c34af913a2ed98f51071a07f5cc4de34538842fce6d04a6a43c4aff" exitCode=0 Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.337528 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a1f4-account-create-vqppj" event={"ID":"5c5e64e4-7905-4524-bd33-8ab355eb2c90","Type":"ContainerDied","Data":"20de24747c34af913a2ed98f51071a07f5cc4de34538842fce6d04a6a43c4aff"} Sep 30 19:48:05 crc 
kubenswrapper[4553]: I0930 19:48:05.337549 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a1f4-account-create-vqppj" event={"ID":"5c5e64e4-7905-4524-bd33-8ab355eb2c90","Type":"ContainerStarted","Data":"f5b6e909f337c13620fce806a579eb9d2afdb63dcab16a6eccded36fcfcb06ba"} Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.358328 4553 scope.go:117] "RemoveContainer" containerID="6b8dc829325ac4457071b8f7c22666ffb59cdda091f91798f557c392ebf2a6b9" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.387606 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7m29p"] Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.390312 4553 scope.go:117] "RemoveContainer" containerID="b90be88b3a7ce03f5cca1a7803f423e779d3420f67e428ee7efa88365a4f83b7" Sep 30 19:48:05 crc kubenswrapper[4553]: E0930 19:48:05.390668 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b90be88b3a7ce03f5cca1a7803f423e779d3420f67e428ee7efa88365a4f83b7\": container with ID starting with b90be88b3a7ce03f5cca1a7803f423e779d3420f67e428ee7efa88365a4f83b7 not found: ID does not exist" containerID="b90be88b3a7ce03f5cca1a7803f423e779d3420f67e428ee7efa88365a4f83b7" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.390725 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b90be88b3a7ce03f5cca1a7803f423e779d3420f67e428ee7efa88365a4f83b7"} err="failed to get container status \"b90be88b3a7ce03f5cca1a7803f423e779d3420f67e428ee7efa88365a4f83b7\": rpc error: code = NotFound desc = could not find container \"b90be88b3a7ce03f5cca1a7803f423e779d3420f67e428ee7efa88365a4f83b7\": container with ID starting with b90be88b3a7ce03f5cca1a7803f423e779d3420f67e428ee7efa88365a4f83b7 not found: ID does not exist" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.390749 4553 scope.go:117] "RemoveContainer" 
containerID="6b8dc829325ac4457071b8f7c22666ffb59cdda091f91798f557c392ebf2a6b9" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.392746 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-7m29p"] Sep 30 19:48:05 crc kubenswrapper[4553]: E0930 19:48:05.394133 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b8dc829325ac4457071b8f7c22666ffb59cdda091f91798f557c392ebf2a6b9\": container with ID starting with 6b8dc829325ac4457071b8f7c22666ffb59cdda091f91798f557c392ebf2a6b9 not found: ID does not exist" containerID="6b8dc829325ac4457071b8f7c22666ffb59cdda091f91798f557c392ebf2a6b9" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.394163 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b8dc829325ac4457071b8f7c22666ffb59cdda091f91798f557c392ebf2a6b9"} err="failed to get container status \"6b8dc829325ac4457071b8f7c22666ffb59cdda091f91798f557c392ebf2a6b9\": rpc error: code = NotFound desc = could not find container \"6b8dc829325ac4457071b8f7c22666ffb59cdda091f91798f557c392ebf2a6b9\": container with ID starting with 6b8dc829325ac4457071b8f7c22666ffb59cdda091f91798f557c392ebf2a6b9 not found: ID does not exist" Sep 30 19:48:05 crc kubenswrapper[4553]: I0930 19:48:05.513472 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5b9d77-f719-44fa-ad65-3562931d6e37" path="/var/lib/kubelet/pods/bd5b9d77-f719-44fa-ad65-3562931d6e37/volumes" Sep 30 19:48:06 crc kubenswrapper[4553]: I0930 19:48:06.793005 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a1f4-account-create-vqppj" Sep 30 19:48:06 crc kubenswrapper[4553]: I0930 19:48:06.802332 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vhp2l" Sep 30 19:48:06 crc kubenswrapper[4553]: I0930 19:48:06.847621 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px9rz\" (UniqueName: \"kubernetes.io/projected/cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8-kube-api-access-px9rz\") pod \"cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8\" (UID: \"cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8\") " Sep 30 19:48:06 crc kubenswrapper[4553]: I0930 19:48:06.847716 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8phc\" (UniqueName: \"kubernetes.io/projected/5c5e64e4-7905-4524-bd33-8ab355eb2c90-kube-api-access-w8phc\") pod \"5c5e64e4-7905-4524-bd33-8ab355eb2c90\" (UID: \"5c5e64e4-7905-4524-bd33-8ab355eb2c90\") " Sep 30 19:48:06 crc kubenswrapper[4553]: I0930 19:48:06.854079 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8-kube-api-access-px9rz" (OuterVolumeSpecName: "kube-api-access-px9rz") pod "cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8" (UID: "cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8"). InnerVolumeSpecName "kube-api-access-px9rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:06 crc kubenswrapper[4553]: I0930 19:48:06.854197 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c5e64e4-7905-4524-bd33-8ab355eb2c90-kube-api-access-w8phc" (OuterVolumeSpecName: "kube-api-access-w8phc") pod "5c5e64e4-7905-4524-bd33-8ab355eb2c90" (UID: "5c5e64e4-7905-4524-bd33-8ab355eb2c90"). InnerVolumeSpecName "kube-api-access-w8phc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:06 crc kubenswrapper[4553]: I0930 19:48:06.950502 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px9rz\" (UniqueName: \"kubernetes.io/projected/cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8-kube-api-access-px9rz\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:06 crc kubenswrapper[4553]: I0930 19:48:06.950548 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8phc\" (UniqueName: \"kubernetes.io/projected/5c5e64e4-7905-4524-bd33-8ab355eb2c90-kube-api-access-w8phc\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.360599 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vhp2l" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.360606 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vhp2l" event={"ID":"cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8","Type":"ContainerDied","Data":"3b0862ba9e31697549599c82647ce1a87cdb81b07a8c536bc9f00feb391caa0d"} Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.361035 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b0862ba9e31697549599c82647ce1a87cdb81b07a8c536bc9f00feb391caa0d" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.364679 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a1f4-account-create-vqppj" event={"ID":"5c5e64e4-7905-4524-bd33-8ab355eb2c90","Type":"ContainerDied","Data":"f5b6e909f337c13620fce806a579eb9d2afdb63dcab16a6eccded36fcfcb06ba"} Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.364887 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5b6e909f337c13620fce806a579eb9d2afdb63dcab16a6eccded36fcfcb06ba" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.365141 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a1f4-account-create-vqppj" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.775066 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-446a-account-create-rdrh4"] Sep 30 19:48:07 crc kubenswrapper[4553]: E0930 19:48:07.775605 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5b9d77-f719-44fa-ad65-3562931d6e37" containerName="dnsmasq-dns" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.775706 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5b9d77-f719-44fa-ad65-3562931d6e37" containerName="dnsmasq-dns" Sep 30 19:48:07 crc kubenswrapper[4553]: E0930 19:48:07.775767 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c5e64e4-7905-4524-bd33-8ab355eb2c90" containerName="mariadb-account-create" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.775817 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c5e64e4-7905-4524-bd33-8ab355eb2c90" containerName="mariadb-account-create" Sep 30 19:48:07 crc kubenswrapper[4553]: E0930 19:48:07.775883 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5b9d77-f719-44fa-ad65-3562931d6e37" containerName="init" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.775947 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5b9d77-f719-44fa-ad65-3562931d6e37" containerName="init" Sep 30 19:48:07 crc kubenswrapper[4553]: E0930 19:48:07.776009 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8" containerName="mariadb-database-create" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.776090 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8" containerName="mariadb-database-create" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.776284 4553 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8" containerName="mariadb-database-create" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.776347 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5b9d77-f719-44fa-ad65-3562931d6e37" containerName="dnsmasq-dns" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.776409 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c5e64e4-7905-4524-bd33-8ab355eb2c90" containerName="mariadb-account-create" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.781943 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-446a-account-create-rdrh4"] Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.782178 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-446a-account-create-rdrh4" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.786084 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.864742 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftdsl\" (UniqueName: \"kubernetes.io/projected/657cedd1-5a4e-4219-977b-92da68039989-kube-api-access-ftdsl\") pod \"glance-446a-account-create-rdrh4\" (UID: \"657cedd1-5a4e-4219-977b-92da68039989\") " pod="openstack/glance-446a-account-create-rdrh4" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.966030 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftdsl\" (UniqueName: \"kubernetes.io/projected/657cedd1-5a4e-4219-977b-92da68039989-kube-api-access-ftdsl\") pod \"glance-446a-account-create-rdrh4\" (UID: \"657cedd1-5a4e-4219-977b-92da68039989\") " pod="openstack/glance-446a-account-create-rdrh4" Sep 30 19:48:07 crc kubenswrapper[4553]: I0930 19:48:07.986183 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ftdsl\" (UniqueName: \"kubernetes.io/projected/657cedd1-5a4e-4219-977b-92da68039989-kube-api-access-ftdsl\") pod \"glance-446a-account-create-rdrh4\" (UID: \"657cedd1-5a4e-4219-977b-92da68039989\") " pod="openstack/glance-446a-account-create-rdrh4" Sep 30 19:48:08 crc kubenswrapper[4553]: I0930 19:48:08.104759 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-446a-account-create-rdrh4" Sep 30 19:48:08 crc kubenswrapper[4553]: I0930 19:48:08.401354 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-446a-account-create-rdrh4"] Sep 30 19:48:09 crc kubenswrapper[4553]: I0930 19:48:09.385088 4553 generic.go:334] "Generic (PLEG): container finished" podID="657cedd1-5a4e-4219-977b-92da68039989" containerID="d7a98675df4d5aa33e7cfcebb4518656a0a4e1ab63b3ad4455b7514f01b890bb" exitCode=0 Sep 30 19:48:09 crc kubenswrapper[4553]: I0930 19:48:09.385171 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-446a-account-create-rdrh4" event={"ID":"657cedd1-5a4e-4219-977b-92da68039989","Type":"ContainerDied","Data":"d7a98675df4d5aa33e7cfcebb4518656a0a4e1ab63b3ad4455b7514f01b890bb"} Sep 30 19:48:09 crc kubenswrapper[4553]: I0930 19:48:09.385410 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-446a-account-create-rdrh4" event={"ID":"657cedd1-5a4e-4219-977b-92da68039989","Type":"ContainerStarted","Data":"5a13c341bdf35447f958dbbfa5153a569d356744f5316ee25f2af7f54be2a2cc"} Sep 30 19:48:10 crc kubenswrapper[4553]: I0930 19:48:10.801827 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-446a-account-create-rdrh4" Sep 30 19:48:10 crc kubenswrapper[4553]: I0930 19:48:10.927282 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftdsl\" (UniqueName: \"kubernetes.io/projected/657cedd1-5a4e-4219-977b-92da68039989-kube-api-access-ftdsl\") pod \"657cedd1-5a4e-4219-977b-92da68039989\" (UID: \"657cedd1-5a4e-4219-977b-92da68039989\") " Sep 30 19:48:10 crc kubenswrapper[4553]: I0930 19:48:10.945363 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/657cedd1-5a4e-4219-977b-92da68039989-kube-api-access-ftdsl" (OuterVolumeSpecName: "kube-api-access-ftdsl") pod "657cedd1-5a4e-4219-977b-92da68039989" (UID: "657cedd1-5a4e-4219-977b-92da68039989"). InnerVolumeSpecName "kube-api-access-ftdsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:11 crc kubenswrapper[4553]: I0930 19:48:11.029017 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftdsl\" (UniqueName: \"kubernetes.io/projected/657cedd1-5a4e-4219-977b-92da68039989-kube-api-access-ftdsl\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:11 crc kubenswrapper[4553]: I0930 19:48:11.231407 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:48:11 crc kubenswrapper[4553]: E0930 19:48:11.231571 4553 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 19:48:11 crc kubenswrapper[4553]: E0930 19:48:11.231596 4553 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 19:48:11 crc kubenswrapper[4553]: E0930 19:48:11.231674 4553 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift podName:0af05a35-cd0b-4875-b263-c8c62ebaa2cc nodeName:}" failed. No retries permitted until 2025-09-30 19:48:27.231655458 +0000 UTC m=+960.431157598 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift") pod "swift-storage-0" (UID: "0af05a35-cd0b-4875-b263-c8c62ebaa2cc") : configmap "swift-ring-files" not found Sep 30 19:48:11 crc kubenswrapper[4553]: I0930 19:48:11.405941 4553 generic.go:334] "Generic (PLEG): container finished" podID="7c4de23a-3df4-47a2-86f1-436a8b11c22d" containerID="d5f0840ea8aa7f8cfbf4b6f00581c0a035ac04301f1152a3204140e7bb6e4c85" exitCode=0 Sep 30 19:48:11 crc kubenswrapper[4553]: I0930 19:48:11.405985 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7c4de23a-3df4-47a2-86f1-436a8b11c22d","Type":"ContainerDied","Data":"d5f0840ea8aa7f8cfbf4b6f00581c0a035ac04301f1152a3204140e7bb6e4c85"} Sep 30 19:48:11 crc kubenswrapper[4553]: I0930 19:48:11.408231 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-446a-account-create-rdrh4" event={"ID":"657cedd1-5a4e-4219-977b-92da68039989","Type":"ContainerDied","Data":"5a13c341bdf35447f958dbbfa5153a569d356744f5316ee25f2af7f54be2a2cc"} Sep 30 19:48:11 crc kubenswrapper[4553]: I0930 19:48:11.408295 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a13c341bdf35447f958dbbfa5153a569d356744f5316ee25f2af7f54be2a2cc" Sep 30 19:48:11 crc kubenswrapper[4553]: I0930 19:48:11.408258 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-446a-account-create-rdrh4" Sep 30 19:48:11 crc kubenswrapper[4553]: E0930 19:48:11.602954 4553 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod657cedd1_5a4e_4219_977b_92da68039989.slice/crio-5a13c341bdf35447f958dbbfa5153a569d356744f5316ee25f2af7f54be2a2cc\": RecentStats: unable to find data in memory cache]" Sep 30 19:48:12 crc kubenswrapper[4553]: I0930 19:48:12.416979 4553 generic.go:334] "Generic (PLEG): container finished" podID="5bde6e85-a37e-4cec-a759-b0cd4eea2807" containerID="56d474ee9d05db649aeef6acfb381ffed38ae8760337766c87b2088290b1b484" exitCode=0 Sep 30 19:48:12 crc kubenswrapper[4553]: I0930 19:48:12.417077 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5bde6e85-a37e-4cec-a759-b0cd4eea2807","Type":"ContainerDied","Data":"56d474ee9d05db649aeef6acfb381ffed38ae8760337766c87b2088290b1b484"} Sep 30 19:48:12 crc kubenswrapper[4553]: I0930 19:48:12.421110 4553 generic.go:334] "Generic (PLEG): container finished" podID="1af5c0f2-d0c8-4b67-889b-b677e346c46c" containerID="52ac3d0d66cbb31fe94ae20105050847c3d448aff1acc03b0f74df228aea9083" exitCode=0 Sep 30 19:48:12 crc kubenswrapper[4553]: I0930 19:48:12.421160 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bfvdv" event={"ID":"1af5c0f2-d0c8-4b67-889b-b677e346c46c","Type":"ContainerDied","Data":"52ac3d0d66cbb31fe94ae20105050847c3d448aff1acc03b0f74df228aea9083"} Sep 30 19:48:12 crc kubenswrapper[4553]: I0930 19:48:12.424225 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7c4de23a-3df4-47a2-86f1-436a8b11c22d","Type":"ContainerStarted","Data":"dcb3f763aeae8c9f93e921c451f3d0d555b08740e63bf5df3c4a6b6b59f53f42"} Sep 30 19:48:12 crc kubenswrapper[4553]: I0930 19:48:12.424491 4553 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.084847 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.682466703 podStartE2EDuration="1m5.084827596s" podCreationTimestamp="2025-09-30 19:47:08 +0000 UTC" firstStartedPulling="2025-09-30 19:47:23.694252478 +0000 UTC m=+896.893754608" lastFinishedPulling="2025-09-30 19:47:37.096613371 +0000 UTC m=+910.296115501" observedRunningTime="2025-09-30 19:48:12.531018424 +0000 UTC m=+945.730520554" watchObservedRunningTime="2025-09-30 19:48:13.084827596 +0000 UTC m=+946.284329726" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.090757 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-gw5ch"] Sep 30 19:48:13 crc kubenswrapper[4553]: E0930 19:48:13.091259 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657cedd1-5a4e-4219-977b-92da68039989" containerName="mariadb-account-create" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.091327 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="657cedd1-5a4e-4219-977b-92da68039989" containerName="mariadb-account-create" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.091559 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="657cedd1-5a4e-4219-977b-92da68039989" containerName="mariadb-account-create" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.092146 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.094122 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zv45w" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.094266 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.105148 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gw5ch"] Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.164025 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-combined-ca-bundle\") pod \"glance-db-sync-gw5ch\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.164078 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs6qb\" (UniqueName: \"kubernetes.io/projected/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-kube-api-access-vs6qb\") pod \"glance-db-sync-gw5ch\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.164104 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-db-sync-config-data\") pod \"glance-db-sync-gw5ch\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.164188 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-config-data\") pod \"glance-db-sync-gw5ch\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.266056 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-combined-ca-bundle\") pod \"glance-db-sync-gw5ch\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.266095 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs6qb\" (UniqueName: \"kubernetes.io/projected/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-kube-api-access-vs6qb\") pod \"glance-db-sync-gw5ch\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.266124 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-db-sync-config-data\") pod \"glance-db-sync-gw5ch\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.266200 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-config-data\") pod \"glance-db-sync-gw5ch\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.269599 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-combined-ca-bundle\") pod \"glance-db-sync-gw5ch\" (UID: 
\"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.274702 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-config-data\") pod \"glance-db-sync-gw5ch\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.274759 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-db-sync-config-data\") pod \"glance-db-sync-gw5ch\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.281604 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs6qb\" (UniqueName: \"kubernetes.io/projected/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-kube-api-access-vs6qb\") pod \"glance-db-sync-gw5ch\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.409846 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gw5ch" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.431221 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5bde6e85-a37e-4cec-a759-b0cd4eea2807","Type":"ContainerStarted","Data":"41dda8cfc13fb7bcb44b99ca2a255b04724c4b21275fdf8333f6b3e5af24fec8"} Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.432649 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.483282 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.759126625 podStartE2EDuration="1m6.483264639s" podCreationTimestamp="2025-09-30 19:47:07 +0000 UTC" firstStartedPulling="2025-09-30 19:47:23.694247308 +0000 UTC m=+896.893749438" lastFinishedPulling="2025-09-30 19:47:37.418385322 +0000 UTC m=+910.617887452" observedRunningTime="2025-09-30 19:48:13.481873622 +0000 UTC m=+946.681375752" watchObservedRunningTime="2025-09-30 19:48:13.483264639 +0000 UTC m=+946.682766769" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.686502 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.688134 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zwpmt" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.885423 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.958950 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r4k44-config-827jf"] Sep 30 19:48:13 crc kubenswrapper[4553]: E0930 19:48:13.959429 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1af5c0f2-d0c8-4b67-889b-b677e346c46c" containerName="swift-ring-rebalance" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.959442 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="1af5c0f2-d0c8-4b67-889b-b677e346c46c" containerName="swift-ring-rebalance" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.959616 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="1af5c0f2-d0c8-4b67-889b-b677e346c46c" containerName="swift-ring-rebalance" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.960129 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.962601 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.976942 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-dispersionconf\") pod \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.976986 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm6vp\" (UniqueName: \"kubernetes.io/projected/1af5c0f2-d0c8-4b67-889b-b677e346c46c-kube-api-access-sm6vp\") pod \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.977124 4553 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1af5c0f2-d0c8-4b67-889b-b677e346c46c-etc-swift\") pod \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.977156 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1af5c0f2-d0c8-4b67-889b-b677e346c46c-ring-data-devices\") pod \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.977175 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-combined-ca-bundle\") pod \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.977225 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1af5c0f2-d0c8-4b67-889b-b677e346c46c-scripts\") pod \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.977245 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-swiftconf\") pod \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\" (UID: \"1af5c0f2-d0c8-4b67-889b-b677e346c46c\") " Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.979316 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r4k44-config-827jf"] Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.981846 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1af5c0f2-d0c8-4b67-889b-b677e346c46c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1af5c0f2-d0c8-4b67-889b-b677e346c46c" (UID: "1af5c0f2-d0c8-4b67-889b-b677e346c46c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:48:13 crc kubenswrapper[4553]: I0930 19:48:13.988188 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1af5c0f2-d0c8-4b67-889b-b677e346c46c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1af5c0f2-d0c8-4b67-889b-b677e346c46c" (UID: "1af5c0f2-d0c8-4b67-889b-b677e346c46c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.010862 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1af5c0f2-d0c8-4b67-889b-b677e346c46c" (UID: "1af5c0f2-d0c8-4b67-889b-b677e346c46c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.011949 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1af5c0f2-d0c8-4b67-889b-b677e346c46c-kube-api-access-sm6vp" (OuterVolumeSpecName: "kube-api-access-sm6vp") pod "1af5c0f2-d0c8-4b67-889b-b677e346c46c" (UID: "1af5c0f2-d0c8-4b67-889b-b677e346c46c"). InnerVolumeSpecName "kube-api-access-sm6vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.037239 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1af5c0f2-d0c8-4b67-889b-b677e346c46c" (UID: "1af5c0f2-d0c8-4b67-889b-b677e346c46c"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.040735 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1af5c0f2-d0c8-4b67-889b-b677e346c46c" (UID: "1af5c0f2-d0c8-4b67-889b-b677e346c46c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.060871 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1af5c0f2-d0c8-4b67-889b-b677e346c46c-scripts" (OuterVolumeSpecName: "scripts") pod "1af5c0f2-d0c8-4b67-889b-b677e346c46c" (UID: "1af5c0f2-d0c8-4b67-889b-b677e346c46c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.078569 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-run-ovn\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.078637 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-log-ovn\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.078678 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmd4v\" (UniqueName: 
\"kubernetes.io/projected/4e698183-177c-44a0-b220-d63229674ab3-kube-api-access-gmd4v\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.078727 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-run\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.078754 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e698183-177c-44a0-b220-d63229674ab3-scripts\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.078774 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e698183-177c-44a0-b220-d63229674ab3-additional-scripts\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.079007 4553 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1af5c0f2-d0c8-4b67-889b-b677e346c46c-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.079032 4553 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1af5c0f2-d0c8-4b67-889b-b677e346c46c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 
30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.079055 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.079066 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1af5c0f2-d0c8-4b67-889b-b677e346c46c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.079074 4553 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.079083 4553 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1af5c0f2-d0c8-4b67-889b-b677e346c46c-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.079092 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm6vp\" (UniqueName: \"kubernetes.io/projected/1af5c0f2-d0c8-4b67-889b-b677e346c46c-kube-api-access-sm6vp\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.180919 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-run-ovn\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.180992 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-log-ovn\") pod 
\"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.181030 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmd4v\" (UniqueName: \"kubernetes.io/projected/4e698183-177c-44a0-b220-d63229674ab3-kube-api-access-gmd4v\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.181110 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-run\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.181158 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e698183-177c-44a0-b220-d63229674ab3-scripts\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.181174 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e698183-177c-44a0-b220-d63229674ab3-additional-scripts\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.181359 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-run-ovn\") pod 
\"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.181728 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-log-ovn\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.181777 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-run\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.181900 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e698183-177c-44a0-b220-d63229674ab3-additional-scripts\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.183688 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e698183-177c-44a0-b220-d63229674ab3-scripts\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: W0930 19:48:14.183691 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f9a8e95_e61a_473d_a74f_cf7a6820ff97.slice/crio-38b62e122cbca5b98fb812459e852f9d52657eb1f8d17a8e156249048fa7772f 
WatchSource:0}: Error finding container 38b62e122cbca5b98fb812459e852f9d52657eb1f8d17a8e156249048fa7772f: Status 404 returned error can't find the container with id 38b62e122cbca5b98fb812459e852f9d52657eb1f8d17a8e156249048fa7772f Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.184310 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gw5ch"] Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.198485 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmd4v\" (UniqueName: \"kubernetes.io/projected/4e698183-177c-44a0-b220-d63229674ab3-kube-api-access-gmd4v\") pod \"ovn-controller-r4k44-config-827jf\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.318050 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.447307 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gw5ch" event={"ID":"3f9a8e95-e61a-473d-a74f-cf7a6820ff97","Type":"ContainerStarted","Data":"38b62e122cbca5b98fb812459e852f9d52657eb1f8d17a8e156249048fa7772f"} Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.450055 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bfvdv" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.452587 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bfvdv" event={"ID":"1af5c0f2-d0c8-4b67-889b-b677e346c46c","Type":"ContainerDied","Data":"60be83aa57e164fa9d7743739c922e31fad862187f13f283c59104b69a517db2"} Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.452640 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60be83aa57e164fa9d7743739c922e31fad862187f13f283c59104b69a517db2" Sep 30 19:48:14 crc kubenswrapper[4553]: I0930 19:48:14.606824 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r4k44-config-827jf"] Sep 30 19:48:15 crc kubenswrapper[4553]: I0930 19:48:15.459344 4553 generic.go:334] "Generic (PLEG): container finished" podID="4e698183-177c-44a0-b220-d63229674ab3" containerID="27e0051eae90e7b155bbe35e06f94d7ded00aba28ff0842abbccb5bc759ecafe" exitCode=0 Sep 30 19:48:15 crc kubenswrapper[4553]: I0930 19:48:15.459530 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r4k44-config-827jf" event={"ID":"4e698183-177c-44a0-b220-d63229674ab3","Type":"ContainerDied","Data":"27e0051eae90e7b155bbe35e06f94d7ded00aba28ff0842abbccb5bc759ecafe"} Sep 30 19:48:15 crc kubenswrapper[4553]: I0930 19:48:15.459862 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r4k44-config-827jf" event={"ID":"4e698183-177c-44a0-b220-d63229674ab3","Type":"ContainerStarted","Data":"24c0e801189e4258b1a4a2c45aa400f72fc9e70dbba29f4514223c36b5b39b5d"} Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.815890 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.833866 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-run\") pod \"4e698183-177c-44a0-b220-d63229674ab3\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.834125 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e698183-177c-44a0-b220-d63229674ab3-additional-scripts\") pod \"4e698183-177c-44a0-b220-d63229674ab3\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.834153 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-run-ovn\") pod \"4e698183-177c-44a0-b220-d63229674ab3\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.834227 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-log-ovn\") pod \"4e698183-177c-44a0-b220-d63229674ab3\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.834254 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e698183-177c-44a0-b220-d63229674ab3-scripts\") pod \"4e698183-177c-44a0-b220-d63229674ab3\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.834285 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmd4v\" (UniqueName: 
\"kubernetes.io/projected/4e698183-177c-44a0-b220-d63229674ab3-kube-api-access-gmd4v\") pod \"4e698183-177c-44a0-b220-d63229674ab3\" (UID: \"4e698183-177c-44a0-b220-d63229674ab3\") " Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.835831 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-run" (OuterVolumeSpecName: "var-run") pod "4e698183-177c-44a0-b220-d63229674ab3" (UID: "4e698183-177c-44a0-b220-d63229674ab3"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.836475 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e698183-177c-44a0-b220-d63229674ab3-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "4e698183-177c-44a0-b220-d63229674ab3" (UID: "4e698183-177c-44a0-b220-d63229674ab3"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.836510 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4e698183-177c-44a0-b220-d63229674ab3" (UID: "4e698183-177c-44a0-b220-d63229674ab3"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.836527 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4e698183-177c-44a0-b220-d63229674ab3" (UID: "4e698183-177c-44a0-b220-d63229674ab3"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.840291 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e698183-177c-44a0-b220-d63229674ab3-kube-api-access-gmd4v" (OuterVolumeSpecName: "kube-api-access-gmd4v") pod "4e698183-177c-44a0-b220-d63229674ab3" (UID: "4e698183-177c-44a0-b220-d63229674ab3"). InnerVolumeSpecName "kube-api-access-gmd4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.842976 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e698183-177c-44a0-b220-d63229674ab3-scripts" (OuterVolumeSpecName: "scripts") pod "4e698183-177c-44a0-b220-d63229674ab3" (UID: "4e698183-177c-44a0-b220-d63229674ab3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.935934 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmd4v\" (UniqueName: \"kubernetes.io/projected/4e698183-177c-44a0-b220-d63229674ab3-kube-api-access-gmd4v\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.935964 4553 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.935976 4553 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e698183-177c-44a0-b220-d63229674ab3-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.935984 4553 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-run-ovn\") on node \"crc\" 
DevicePath \"\"" Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.935993 4553 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e698183-177c-44a0-b220-d63229674ab3-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:16 crc kubenswrapper[4553]: I0930 19:48:16.936000 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e698183-177c-44a0-b220-d63229674ab3-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:17 crc kubenswrapper[4553]: I0930 19:48:17.485427 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r4k44-config-827jf" event={"ID":"4e698183-177c-44a0-b220-d63229674ab3","Type":"ContainerDied","Data":"24c0e801189e4258b1a4a2c45aa400f72fc9e70dbba29f4514223c36b5b39b5d"} Sep 30 19:48:17 crc kubenswrapper[4553]: I0930 19:48:17.485464 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r4k44-config-827jf" Sep 30 19:48:17 crc kubenswrapper[4553]: I0930 19:48:17.485467 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24c0e801189e4258b1a4a2c45aa400f72fc9e70dbba29f4514223c36b5b39b5d" Sep 30 19:48:17 crc kubenswrapper[4553]: I0930 19:48:17.940016 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-r4k44-config-827jf"] Sep 30 19:48:17 crc kubenswrapper[4553]: I0930 19:48:17.947958 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-r4k44-config-827jf"] Sep 30 19:48:19 crc kubenswrapper[4553]: I0930 19:48:19.513927 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e698183-177c-44a0-b220-d63229674ab3" path="/var/lib/kubelet/pods/4e698183-177c-44a0-b220-d63229674ab3/volumes" Sep 30 19:48:22 crc kubenswrapper[4553]: I0930 19:48:22.255270 4553 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-e21a-account-create-tqjjr"] Sep 30 19:48:22 crc kubenswrapper[4553]: E0930 19:48:22.255793 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e698183-177c-44a0-b220-d63229674ab3" containerName="ovn-config" Sep 30 19:48:22 crc kubenswrapper[4553]: I0930 19:48:22.255805 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e698183-177c-44a0-b220-d63229674ab3" containerName="ovn-config" Sep 30 19:48:22 crc kubenswrapper[4553]: I0930 19:48:22.255964 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e698183-177c-44a0-b220-d63229674ab3" containerName="ovn-config" Sep 30 19:48:22 crc kubenswrapper[4553]: I0930 19:48:22.256459 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e21a-account-create-tqjjr" Sep 30 19:48:22 crc kubenswrapper[4553]: I0930 19:48:22.258495 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 30 19:48:22 crc kubenswrapper[4553]: I0930 19:48:22.264160 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e21a-account-create-tqjjr"] Sep 30 19:48:22 crc kubenswrapper[4553]: I0930 19:48:22.322887 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c96cd\" (UniqueName: \"kubernetes.io/projected/8dac5b01-0adc-4d37-9dcb-707537a02cf0-kube-api-access-c96cd\") pod \"keystone-e21a-account-create-tqjjr\" (UID: \"8dac5b01-0adc-4d37-9dcb-707537a02cf0\") " pod="openstack/keystone-e21a-account-create-tqjjr" Sep 30 19:48:22 crc kubenswrapper[4553]: I0930 19:48:22.424299 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c96cd\" (UniqueName: \"kubernetes.io/projected/8dac5b01-0adc-4d37-9dcb-707537a02cf0-kube-api-access-c96cd\") pod \"keystone-e21a-account-create-tqjjr\" (UID: \"8dac5b01-0adc-4d37-9dcb-707537a02cf0\") " 
pod="openstack/keystone-e21a-account-create-tqjjr" Sep 30 19:48:22 crc kubenswrapper[4553]: I0930 19:48:22.443794 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c96cd\" (UniqueName: \"kubernetes.io/projected/8dac5b01-0adc-4d37-9dcb-707537a02cf0-kube-api-access-c96cd\") pod \"keystone-e21a-account-create-tqjjr\" (UID: \"8dac5b01-0adc-4d37-9dcb-707537a02cf0\") " pod="openstack/keystone-e21a-account-create-tqjjr" Sep 30 19:48:22 crc kubenswrapper[4553]: I0930 19:48:22.591921 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e21a-account-create-tqjjr" Sep 30 19:48:23 crc kubenswrapper[4553]: I0930 19:48:23.634279 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-r4k44" Sep 30 19:48:27 crc kubenswrapper[4553]: I0930 19:48:27.319021 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:48:27 crc kubenswrapper[4553]: I0930 19:48:27.324577 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0af05a35-cd0b-4875-b263-c8c62ebaa2cc-etc-swift\") pod \"swift-storage-0\" (UID: \"0af05a35-cd0b-4875-b263-c8c62ebaa2cc\") " pod="openstack/swift-storage-0" Sep 30 19:48:27 crc kubenswrapper[4553]: I0930 19:48:27.358228 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e21a-account-create-tqjjr"] Sep 30 19:48:27 crc kubenswrapper[4553]: W0930 19:48:27.363431 4553 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dac5b01_0adc_4d37_9dcb_707537a02cf0.slice/crio-88394f5e13ac50a42386804a8b7c656cce261832b238690c6d42a1d37c91528a WatchSource:0}: Error finding container 88394f5e13ac50a42386804a8b7c656cce261832b238690c6d42a1d37c91528a: Status 404 returned error can't find the container with id 88394f5e13ac50a42386804a8b7c656cce261832b238690c6d42a1d37c91528a Sep 30 19:48:27 crc kubenswrapper[4553]: I0930 19:48:27.500581 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Sep 30 19:48:27 crc kubenswrapper[4553]: I0930 19:48:27.583948 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e21a-account-create-tqjjr" event={"ID":"8dac5b01-0adc-4d37-9dcb-707537a02cf0","Type":"ContainerStarted","Data":"e6ff23f280f5ea964e8ec5f0752a4d1cec1ec047b059abe13c4e5dd98098ad22"} Sep 30 19:48:27 crc kubenswrapper[4553]: I0930 19:48:27.583993 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e21a-account-create-tqjjr" event={"ID":"8dac5b01-0adc-4d37-9dcb-707537a02cf0","Type":"ContainerStarted","Data":"88394f5e13ac50a42386804a8b7c656cce261832b238690c6d42a1d37c91528a"} Sep 30 19:48:28 crc kubenswrapper[4553]: I0930 19:48:28.042361 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Sep 30 19:48:28 crc kubenswrapper[4553]: W0930 19:48:28.056668 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0af05a35_cd0b_4875_b263_c8c62ebaa2cc.slice/crio-fd3b067fdd105dd920bf8b8df3f313c40bb19b712adfdfb8b16fb592e8b142b4 WatchSource:0}: Error finding container fd3b067fdd105dd920bf8b8df3f313c40bb19b712adfdfb8b16fb592e8b142b4: Status 404 returned error can't find the container with id fd3b067fdd105dd920bf8b8df3f313c40bb19b712adfdfb8b16fb592e8b142b4 Sep 30 19:48:28 crc kubenswrapper[4553]: I0930 19:48:28.594253 4553 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"fd3b067fdd105dd920bf8b8df3f313c40bb19b712adfdfb8b16fb592e8b142b4"} Sep 30 19:48:28 crc kubenswrapper[4553]: I0930 19:48:28.597305 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gw5ch" event={"ID":"3f9a8e95-e61a-473d-a74f-cf7a6820ff97","Type":"ContainerStarted","Data":"8b86fa732ebe460ff0df6d1a947d9d597f10f8fc2044fc5328f14860e3a3852c"} Sep 30 19:48:28 crc kubenswrapper[4553]: I0930 19:48:28.600098 4553 generic.go:334] "Generic (PLEG): container finished" podID="8dac5b01-0adc-4d37-9dcb-707537a02cf0" containerID="e6ff23f280f5ea964e8ec5f0752a4d1cec1ec047b059abe13c4e5dd98098ad22" exitCode=0 Sep 30 19:48:28 crc kubenswrapper[4553]: I0930 19:48:28.600178 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e21a-account-create-tqjjr" event={"ID":"8dac5b01-0adc-4d37-9dcb-707537a02cf0","Type":"ContainerDied","Data":"e6ff23f280f5ea964e8ec5f0752a4d1cec1ec047b059abe13c4e5dd98098ad22"} Sep 30 19:48:28 crc kubenswrapper[4553]: I0930 19:48:28.632327 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-gw5ch" podStartSLOduration=2.7419466679999998 podStartE2EDuration="15.632297559s" podCreationTimestamp="2025-09-30 19:48:13 +0000 UTC" firstStartedPulling="2025-09-30 19:48:14.185719469 +0000 UTC m=+947.385221599" lastFinishedPulling="2025-09-30 19:48:27.07607035 +0000 UTC m=+960.275572490" observedRunningTime="2025-09-30 19:48:28.615687975 +0000 UTC m=+961.815190125" watchObservedRunningTime="2025-09-30 19:48:28.632297559 +0000 UTC m=+961.831799709" Sep 30 19:48:29 crc kubenswrapper[4553]: I0930 19:48:29.275420 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:48:29 crc kubenswrapper[4553]: I0930 19:48:29.585302 4553 
patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:48:29 crc kubenswrapper[4553]: I0930 19:48:29.585682 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:48:29 crc kubenswrapper[4553]: I0930 19:48:29.585735 4553 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:48:29 crc kubenswrapper[4553]: I0930 19:48:29.586499 4553 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c53001a48c79a1addca634bfcf9ef4be43fc5d44c498f0ba986c32047fcaed3"} pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:48:29 crc kubenswrapper[4553]: I0930 19:48:29.586568 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" containerID="cri-o://6c53001a48c79a1addca634bfcf9ef4be43fc5d44c498f0ba986c32047fcaed3" gracePeriod=600 Sep 30 19:48:29 crc kubenswrapper[4553]: I0930 19:48:29.610030 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"232412637ea620e7d65cbe2499237b368f37614f66c15ad173435a79994bc60a"} Sep 30 19:48:29 crc kubenswrapper[4553]: I0930 19:48:29.637121 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 19:48:30 crc kubenswrapper[4553]: I0930 19:48:30.110429 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e21a-account-create-tqjjr" Sep 30 19:48:30 crc kubenswrapper[4553]: I0930 19:48:30.209304 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c96cd\" (UniqueName: \"kubernetes.io/projected/8dac5b01-0adc-4d37-9dcb-707537a02cf0-kube-api-access-c96cd\") pod \"8dac5b01-0adc-4d37-9dcb-707537a02cf0\" (UID: \"8dac5b01-0adc-4d37-9dcb-707537a02cf0\") " Sep 30 19:48:30 crc kubenswrapper[4553]: I0930 19:48:30.215543 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dac5b01-0adc-4d37-9dcb-707537a02cf0-kube-api-access-c96cd" (OuterVolumeSpecName: "kube-api-access-c96cd") pod "8dac5b01-0adc-4d37-9dcb-707537a02cf0" (UID: "8dac5b01-0adc-4d37-9dcb-707537a02cf0"). InnerVolumeSpecName "kube-api-access-c96cd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:30 crc kubenswrapper[4553]: I0930 19:48:30.311457 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c96cd\" (UniqueName: \"kubernetes.io/projected/8dac5b01-0adc-4d37-9dcb-707537a02cf0-kube-api-access-c96cd\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:30 crc kubenswrapper[4553]: I0930 19:48:30.630811 4553 generic.go:334] "Generic (PLEG): container finished" podID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerID="6c53001a48c79a1addca634bfcf9ef4be43fc5d44c498f0ba986c32047fcaed3" exitCode=0 Sep 30 19:48:30 crc kubenswrapper[4553]: I0930 19:48:30.630921 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerDied","Data":"6c53001a48c79a1addca634bfcf9ef4be43fc5d44c498f0ba986c32047fcaed3"} Sep 30 19:48:30 crc kubenswrapper[4553]: I0930 19:48:30.632403 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerStarted","Data":"a59ed9a27838f8357f3a7a080d587703e9b1aa4272b3bbad7477f76d8c23eba2"} Sep 30 19:48:30 crc kubenswrapper[4553]: I0930 19:48:30.632769 4553 scope.go:117] "RemoveContainer" containerID="4d0705cac6e5b952d02766c3f1729599066280437bbe55ec8f4688736bf24a4f" Sep 30 19:48:30 crc kubenswrapper[4553]: I0930 19:48:30.635179 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e21a-account-create-tqjjr" event={"ID":"8dac5b01-0adc-4d37-9dcb-707537a02cf0","Type":"ContainerDied","Data":"88394f5e13ac50a42386804a8b7c656cce261832b238690c6d42a1d37c91528a"} Sep 30 19:48:30 crc kubenswrapper[4553]: I0930 19:48:30.635384 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88394f5e13ac50a42386804a8b7c656cce261832b238690c6d42a1d37c91528a" Sep 30 19:48:30 
crc kubenswrapper[4553]: I0930 19:48:30.635497 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e21a-account-create-tqjjr" Sep 30 19:48:30 crc kubenswrapper[4553]: I0930 19:48:30.641514 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"e60279469ebae20e1596a18ce71d9509e1326e3c584be493308dfe1ac19a3fb3"} Sep 30 19:48:30 crc kubenswrapper[4553]: I0930 19:48:30.641555 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"f543a05fb3a2ffcad47eefff346be875e2492cc15173485b5f002d65d1a513d3"} Sep 30 19:48:30 crc kubenswrapper[4553]: I0930 19:48:30.641564 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"b83787bc19ddddbc28a189cff8e35f7eaa0fa33a26b079bc83bab7bc469d185e"} Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.076378 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qg8lx"] Sep 30 19:48:32 crc kubenswrapper[4553]: E0930 19:48:32.077273 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dac5b01-0adc-4d37-9dcb-707537a02cf0" containerName="mariadb-account-create" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.077291 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dac5b01-0adc-4d37-9dcb-707537a02cf0" containerName="mariadb-account-create" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.077463 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dac5b01-0adc-4d37-9dcb-707537a02cf0" containerName="mariadb-account-create" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.077955 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qg8lx" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.106459 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qg8lx"] Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.149711 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj2g6\" (UniqueName: \"kubernetes.io/projected/d8f3b7b5-90a0-44bc-9ba4-40729ffe3000-kube-api-access-gj2g6\") pod \"cinder-db-create-qg8lx\" (UID: \"d8f3b7b5-90a0-44bc-9ba4-40729ffe3000\") " pod="openstack/cinder-db-create-qg8lx" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.250669 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj2g6\" (UniqueName: \"kubernetes.io/projected/d8f3b7b5-90a0-44bc-9ba4-40729ffe3000-kube-api-access-gj2g6\") pod \"cinder-db-create-qg8lx\" (UID: \"d8f3b7b5-90a0-44bc-9ba4-40729ffe3000\") " pod="openstack/cinder-db-create-qg8lx" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.281075 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj2g6\" (UniqueName: \"kubernetes.io/projected/d8f3b7b5-90a0-44bc-9ba4-40729ffe3000-kube-api-access-gj2g6\") pod \"cinder-db-create-qg8lx\" (UID: \"d8f3b7b5-90a0-44bc-9ba4-40729ffe3000\") " pod="openstack/cinder-db-create-qg8lx" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.310187 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6q88m"] Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.311155 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6q88m" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.339908 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6q88m"] Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.352452 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwqjv\" (UniqueName: \"kubernetes.io/projected/d5bd8102-b39f-40ee-b03d-9912adca9e41-kube-api-access-vwqjv\") pod \"barbican-db-create-6q88m\" (UID: \"d5bd8102-b39f-40ee-b03d-9912adca9e41\") " pod="openstack/barbican-db-create-6q88m" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.403345 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qg8lx" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.453857 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwqjv\" (UniqueName: \"kubernetes.io/projected/d5bd8102-b39f-40ee-b03d-9912adca9e41-kube-api-access-vwqjv\") pod \"barbican-db-create-6q88m\" (UID: \"d5bd8102-b39f-40ee-b03d-9912adca9e41\") " pod="openstack/barbican-db-create-6q88m" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.480142 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwqjv\" (UniqueName: \"kubernetes.io/projected/d5bd8102-b39f-40ee-b03d-9912adca9e41-kube-api-access-vwqjv\") pod \"barbican-db-create-6q88m\" (UID: \"d5bd8102-b39f-40ee-b03d-9912adca9e41\") " pod="openstack/barbican-db-create-6q88m" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.555228 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-nvx5r"] Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.556207 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-nvx5r" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.597566 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nvx5r"] Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.628206 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6q88m" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.656559 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26t87\" (UniqueName: \"kubernetes.io/projected/b0be7cfb-07fc-426f-a177-4199643cff46-kube-api-access-26t87\") pod \"neutron-db-create-nvx5r\" (UID: \"b0be7cfb-07fc-426f-a177-4199643cff46\") " pod="openstack/neutron-db-create-nvx5r" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.758139 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26t87\" (UniqueName: \"kubernetes.io/projected/b0be7cfb-07fc-426f-a177-4199643cff46-kube-api-access-26t87\") pod \"neutron-db-create-nvx5r\" (UID: \"b0be7cfb-07fc-426f-a177-4199643cff46\") " pod="openstack/neutron-db-create-nvx5r" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.780018 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26t87\" (UniqueName: \"kubernetes.io/projected/b0be7cfb-07fc-426f-a177-4199643cff46-kube-api-access-26t87\") pod \"neutron-db-create-nvx5r\" (UID: \"b0be7cfb-07fc-426f-a177-4199643cff46\") " pod="openstack/neutron-db-create-nvx5r" Sep 30 19:48:32 crc kubenswrapper[4553]: I0930 19:48:32.978360 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-nvx5r" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.060105 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6q88m"] Sep 30 19:48:33 crc kubenswrapper[4553]: W0930 19:48:33.063015 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5bd8102_b39f_40ee_b03d_9912adca9e41.slice/crio-29dcb812cb4fc858b10a0bc5ba963621c6ce86b01c6b4a6b9facc0e4df25ec94 WatchSource:0}: Error finding container 29dcb812cb4fc858b10a0bc5ba963621c6ce86b01c6b4a6b9facc0e4df25ec94: Status 404 returned error can't find the container with id 29dcb812cb4fc858b10a0bc5ba963621c6ce86b01c6b4a6b9facc0e4df25ec94 Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.145582 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qg8lx"] Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.319904 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-hsmcl"] Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.320956 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hsmcl" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.327171 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nvx5r"] Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.333792 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hsmcl"] Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.341455 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.341488 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.341658 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.341841 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hswpl" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.407307 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8ffz\" (UniqueName: \"kubernetes.io/projected/3f158e70-9924-417e-b100-983f574bef9a-kube-api-access-j8ffz\") pod \"keystone-db-sync-hsmcl\" (UID: \"3f158e70-9924-417e-b100-983f574bef9a\") " pod="openstack/keystone-db-sync-hsmcl" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.407387 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f158e70-9924-417e-b100-983f574bef9a-config-data\") pod \"keystone-db-sync-hsmcl\" (UID: \"3f158e70-9924-417e-b100-983f574bef9a\") " pod="openstack/keystone-db-sync-hsmcl" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.407454 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f158e70-9924-417e-b100-983f574bef9a-combined-ca-bundle\") pod \"keystone-db-sync-hsmcl\" (UID: \"3f158e70-9924-417e-b100-983f574bef9a\") " pod="openstack/keystone-db-sync-hsmcl" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.508696 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f158e70-9924-417e-b100-983f574bef9a-config-data\") pod \"keystone-db-sync-hsmcl\" (UID: \"3f158e70-9924-417e-b100-983f574bef9a\") " pod="openstack/keystone-db-sync-hsmcl" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.508853 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f158e70-9924-417e-b100-983f574bef9a-combined-ca-bundle\") pod \"keystone-db-sync-hsmcl\" (UID: \"3f158e70-9924-417e-b100-983f574bef9a\") " pod="openstack/keystone-db-sync-hsmcl" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.508879 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8ffz\" (UniqueName: \"kubernetes.io/projected/3f158e70-9924-417e-b100-983f574bef9a-kube-api-access-j8ffz\") pod \"keystone-db-sync-hsmcl\" (UID: \"3f158e70-9924-417e-b100-983f574bef9a\") " pod="openstack/keystone-db-sync-hsmcl" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.516401 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f158e70-9924-417e-b100-983f574bef9a-combined-ca-bundle\") pod \"keystone-db-sync-hsmcl\" (UID: \"3f158e70-9924-417e-b100-983f574bef9a\") " pod="openstack/keystone-db-sync-hsmcl" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.518750 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f158e70-9924-417e-b100-983f574bef9a-config-data\") pod 
\"keystone-db-sync-hsmcl\" (UID: \"3f158e70-9924-417e-b100-983f574bef9a\") " pod="openstack/keystone-db-sync-hsmcl" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.536417 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8ffz\" (UniqueName: \"kubernetes.io/projected/3f158e70-9924-417e-b100-983f574bef9a-kube-api-access-j8ffz\") pod \"keystone-db-sync-hsmcl\" (UID: \"3f158e70-9924-417e-b100-983f574bef9a\") " pod="openstack/keystone-db-sync-hsmcl" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.694749 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6q88m" event={"ID":"d5bd8102-b39f-40ee-b03d-9912adca9e41","Type":"ContainerStarted","Data":"cd66fe117268930bf59d7a82e1ac54ea688cff6521e461a1746a34b6d2aa95ea"} Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.694793 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6q88m" event={"ID":"d5bd8102-b39f-40ee-b03d-9912adca9e41","Type":"ContainerStarted","Data":"29dcb812cb4fc858b10a0bc5ba963621c6ce86b01c6b4a6b9facc0e4df25ec94"} Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.696627 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qg8lx" event={"ID":"d8f3b7b5-90a0-44bc-9ba4-40729ffe3000","Type":"ContainerStarted","Data":"a4ae681a0a75a849d4392af055327ef594180ee13088f923328005db87a42adf"} Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.696668 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qg8lx" event={"ID":"d8f3b7b5-90a0-44bc-9ba4-40729ffe3000","Type":"ContainerStarted","Data":"18e231fa31af7b64b3bb35f63bca663d19b1b265bc360365bd753dd011e9e5af"} Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.700336 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"b54ecfe7601a17e92eea6451e3d79134a1dadc7308281baf13b018100a3b248c"} Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.700366 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"ea7c91586fb471be266775924b040e4640ae1eee4f44ca28c3e04c9cd93b1c9c"} Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.701853 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nvx5r" event={"ID":"b0be7cfb-07fc-426f-a177-4199643cff46","Type":"ContainerStarted","Data":"4cf710ba1ea39cbd129d7ea179d80883ef83f5c8a8bc6410b2f2b1ecb7d68c30"} Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.701893 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nvx5r" event={"ID":"b0be7cfb-07fc-426f-a177-4199643cff46","Type":"ContainerStarted","Data":"ee1c36f8ccce004f8e12468b4c6c5b2c95fafae093a687221df1e504e65b684b"} Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.719924 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-6q88m" podStartSLOduration=1.719907201 podStartE2EDuration="1.719907201s" podCreationTimestamp="2025-09-30 19:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:48:33.715438252 +0000 UTC m=+966.914940382" watchObservedRunningTime="2025-09-30 19:48:33.719907201 +0000 UTC m=+966.919409331" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.730557 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hsmcl" Sep 30 19:48:33 crc kubenswrapper[4553]: I0930 19:48:33.731184 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-qg8lx" podStartSLOduration=1.7311663419999999 podStartE2EDuration="1.731166342s" podCreationTimestamp="2025-09-30 19:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:48:33.727578536 +0000 UTC m=+966.927080666" watchObservedRunningTime="2025-09-30 19:48:33.731166342 +0000 UTC m=+966.930668472" Sep 30 19:48:34 crc kubenswrapper[4553]: I0930 19:48:34.266250 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-nvx5r" podStartSLOduration=2.266234743 podStartE2EDuration="2.266234743s" podCreationTimestamp="2025-09-30 19:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:48:33.750606743 +0000 UTC m=+966.950108873" watchObservedRunningTime="2025-09-30 19:48:34.266234743 +0000 UTC m=+967.465736873" Sep 30 19:48:34 crc kubenswrapper[4553]: I0930 19:48:34.269393 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-hsmcl"] Sep 30 19:48:34 crc kubenswrapper[4553]: W0930 19:48:34.274825 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f158e70_9924_417e_b100_983f574bef9a.slice/crio-fd9194e0fd2c213c949e3aaee3601d6818243eb76d822c37f2f3e44fcce23d12 WatchSource:0}: Error finding container fd9194e0fd2c213c949e3aaee3601d6818243eb76d822c37f2f3e44fcce23d12: Status 404 returned error can't find the container with id fd9194e0fd2c213c949e3aaee3601d6818243eb76d822c37f2f3e44fcce23d12 Sep 30 19:48:34 crc kubenswrapper[4553]: I0930 19:48:34.713817 4553 generic.go:334] "Generic (PLEG): 
container finished" podID="b0be7cfb-07fc-426f-a177-4199643cff46" containerID="4cf710ba1ea39cbd129d7ea179d80883ef83f5c8a8bc6410b2f2b1ecb7d68c30" exitCode=0 Sep 30 19:48:34 crc kubenswrapper[4553]: I0930 19:48:34.713879 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nvx5r" event={"ID":"b0be7cfb-07fc-426f-a177-4199643cff46","Type":"ContainerDied","Data":"4cf710ba1ea39cbd129d7ea179d80883ef83f5c8a8bc6410b2f2b1ecb7d68c30"} Sep 30 19:48:34 crc kubenswrapper[4553]: I0930 19:48:34.720643 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hsmcl" event={"ID":"3f158e70-9924-417e-b100-983f574bef9a","Type":"ContainerStarted","Data":"fd9194e0fd2c213c949e3aaee3601d6818243eb76d822c37f2f3e44fcce23d12"} Sep 30 19:48:34 crc kubenswrapper[4553]: I0930 19:48:34.723108 4553 generic.go:334] "Generic (PLEG): container finished" podID="d5bd8102-b39f-40ee-b03d-9912adca9e41" containerID="cd66fe117268930bf59d7a82e1ac54ea688cff6521e461a1746a34b6d2aa95ea" exitCode=0 Sep 30 19:48:34 crc kubenswrapper[4553]: I0930 19:48:34.723181 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6q88m" event={"ID":"d5bd8102-b39f-40ee-b03d-9912adca9e41","Type":"ContainerDied","Data":"cd66fe117268930bf59d7a82e1ac54ea688cff6521e461a1746a34b6d2aa95ea"} Sep 30 19:48:34 crc kubenswrapper[4553]: I0930 19:48:34.728077 4553 generic.go:334] "Generic (PLEG): container finished" podID="d8f3b7b5-90a0-44bc-9ba4-40729ffe3000" containerID="a4ae681a0a75a849d4392af055327ef594180ee13088f923328005db87a42adf" exitCode=0 Sep 30 19:48:34 crc kubenswrapper[4553]: I0930 19:48:34.728135 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qg8lx" event={"ID":"d8f3b7b5-90a0-44bc-9ba4-40729ffe3000","Type":"ContainerDied","Data":"a4ae681a0a75a849d4392af055327ef594180ee13088f923328005db87a42adf"} Sep 30 19:48:34 crc kubenswrapper[4553]: I0930 19:48:34.736924 4553 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"650c0702398cee5bafb137b5aacc3e57a9c7c5c5544a5a99b0d5eb3b2336afcb"} Sep 30 19:48:34 crc kubenswrapper[4553]: I0930 19:48:34.736968 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"228446eee913e11ec5a3aaf61e5f2e1e026c24f80ab004cb209c15b8e72008e6"} Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.121325 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qg8lx" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.160540 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj2g6\" (UniqueName: \"kubernetes.io/projected/d8f3b7b5-90a0-44bc-9ba4-40729ffe3000-kube-api-access-gj2g6\") pod \"d8f3b7b5-90a0-44bc-9ba4-40729ffe3000\" (UID: \"d8f3b7b5-90a0-44bc-9ba4-40729ffe3000\") " Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.168080 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f3b7b5-90a0-44bc-9ba4-40729ffe3000-kube-api-access-gj2g6" (OuterVolumeSpecName: "kube-api-access-gj2g6") pod "d8f3b7b5-90a0-44bc-9ba4-40729ffe3000" (UID: "d8f3b7b5-90a0-44bc-9ba4-40729ffe3000"). InnerVolumeSpecName "kube-api-access-gj2g6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.208920 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6q88m" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.223672 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-nvx5r" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.262370 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwqjv\" (UniqueName: \"kubernetes.io/projected/d5bd8102-b39f-40ee-b03d-9912adca9e41-kube-api-access-vwqjv\") pod \"d5bd8102-b39f-40ee-b03d-9912adca9e41\" (UID: \"d5bd8102-b39f-40ee-b03d-9912adca9e41\") " Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.262693 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26t87\" (UniqueName: \"kubernetes.io/projected/b0be7cfb-07fc-426f-a177-4199643cff46-kube-api-access-26t87\") pod \"b0be7cfb-07fc-426f-a177-4199643cff46\" (UID: \"b0be7cfb-07fc-426f-a177-4199643cff46\") " Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.263294 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj2g6\" (UniqueName: \"kubernetes.io/projected/d8f3b7b5-90a0-44bc-9ba4-40729ffe3000-kube-api-access-gj2g6\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.265730 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0be7cfb-07fc-426f-a177-4199643cff46-kube-api-access-26t87" (OuterVolumeSpecName: "kube-api-access-26t87") pod "b0be7cfb-07fc-426f-a177-4199643cff46" (UID: "b0be7cfb-07fc-426f-a177-4199643cff46"). InnerVolumeSpecName "kube-api-access-26t87". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.268229 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5bd8102-b39f-40ee-b03d-9912adca9e41-kube-api-access-vwqjv" (OuterVolumeSpecName: "kube-api-access-vwqjv") pod "d5bd8102-b39f-40ee-b03d-9912adca9e41" (UID: "d5bd8102-b39f-40ee-b03d-9912adca9e41"). InnerVolumeSpecName "kube-api-access-vwqjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.364713 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26t87\" (UniqueName: \"kubernetes.io/projected/b0be7cfb-07fc-426f-a177-4199643cff46-kube-api-access-26t87\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.364849 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwqjv\" (UniqueName: \"kubernetes.io/projected/d5bd8102-b39f-40ee-b03d-9912adca9e41-kube-api-access-vwqjv\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.767165 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"d6953ddb88ddddeedea8b091cc5fccb2ace0d94b52d99175c2767b2597a13d1f"} Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.770791 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nvx5r" event={"ID":"b0be7cfb-07fc-426f-a177-4199643cff46","Type":"ContainerDied","Data":"ee1c36f8ccce004f8e12468b4c6c5b2c95fafae093a687221df1e504e65b684b"} Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.770936 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee1c36f8ccce004f8e12468b4c6c5b2c95fafae093a687221df1e504e65b684b" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.771110 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-nvx5r" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.774773 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6q88m" event={"ID":"d5bd8102-b39f-40ee-b03d-9912adca9e41","Type":"ContainerDied","Data":"29dcb812cb4fc858b10a0bc5ba963621c6ce86b01c6b4a6b9facc0e4df25ec94"} Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.774818 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29dcb812cb4fc858b10a0bc5ba963621c6ce86b01c6b4a6b9facc0e4df25ec94" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.774791 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6q88m" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.776889 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qg8lx" event={"ID":"d8f3b7b5-90a0-44bc-9ba4-40729ffe3000","Type":"ContainerDied","Data":"18e231fa31af7b64b3bb35f63bca663d19b1b265bc360365bd753dd011e9e5af"} Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.776911 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18e231fa31af7b64b3bb35f63bca663d19b1b265bc360365bd753dd011e9e5af" Sep 30 19:48:36 crc kubenswrapper[4553]: I0930 19:48:36.776953 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qg8lx" Sep 30 19:48:37 crc kubenswrapper[4553]: I0930 19:48:37.788985 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"5085c9ecaa0d05750a54aab0c82081b447406dac809ace1b9270e210a8cf9790"} Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.236309 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-e6e0-account-create-g2wlx"] Sep 30 19:48:42 crc kubenswrapper[4553]: E0930 19:48:42.237302 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f3b7b5-90a0-44bc-9ba4-40729ffe3000" containerName="mariadb-database-create" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.237319 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f3b7b5-90a0-44bc-9ba4-40729ffe3000" containerName="mariadb-database-create" Sep 30 19:48:42 crc kubenswrapper[4553]: E0930 19:48:42.237347 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bd8102-b39f-40ee-b03d-9912adca9e41" containerName="mariadb-database-create" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.237356 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bd8102-b39f-40ee-b03d-9912adca9e41" containerName="mariadb-database-create" Sep 30 19:48:42 crc kubenswrapper[4553]: E0930 19:48:42.237375 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0be7cfb-07fc-426f-a177-4199643cff46" containerName="mariadb-database-create" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.237383 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0be7cfb-07fc-426f-a177-4199643cff46" containerName="mariadb-database-create" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.237623 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f3b7b5-90a0-44bc-9ba4-40729ffe3000" containerName="mariadb-database-create" Sep 30 19:48:42 
crc kubenswrapper[4553]: I0930 19:48:42.237650 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0be7cfb-07fc-426f-a177-4199643cff46" containerName="mariadb-database-create" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.237671 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5bd8102-b39f-40ee-b03d-9912adca9e41" containerName="mariadb-database-create" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.238328 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e6e0-account-create-g2wlx" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.241237 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.258222 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e6e0-account-create-g2wlx"] Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.261512 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vswgg\" (UniqueName: \"kubernetes.io/projected/b2f1d2b7-12bf-4d45-b80e-712e015a61e5-kube-api-access-vswgg\") pod \"barbican-e6e0-account-create-g2wlx\" (UID: \"b2f1d2b7-12bf-4d45-b80e-712e015a61e5\") " pod="openstack/barbican-e6e0-account-create-g2wlx" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.363385 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vswgg\" (UniqueName: \"kubernetes.io/projected/b2f1d2b7-12bf-4d45-b80e-712e015a61e5-kube-api-access-vswgg\") pod \"barbican-e6e0-account-create-g2wlx\" (UID: \"b2f1d2b7-12bf-4d45-b80e-712e015a61e5\") " pod="openstack/barbican-e6e0-account-create-g2wlx" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.383105 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vswgg\" (UniqueName: 
\"kubernetes.io/projected/b2f1d2b7-12bf-4d45-b80e-712e015a61e5-kube-api-access-vswgg\") pod \"barbican-e6e0-account-create-g2wlx\" (UID: \"b2f1d2b7-12bf-4d45-b80e-712e015a61e5\") " pod="openstack/barbican-e6e0-account-create-g2wlx" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.532310 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-981f-account-create-68lnd"] Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.533298 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-981f-account-create-68lnd" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.536085 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.540405 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-981f-account-create-68lnd"] Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.555563 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e6e0-account-create-g2wlx" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.568763 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bzl4\" (UniqueName: \"kubernetes.io/projected/ccd42fc2-3f65-4d1f-a6bc-4c564f653f90-kube-api-access-4bzl4\") pod \"neutron-981f-account-create-68lnd\" (UID: \"ccd42fc2-3f65-4d1f-a6bc-4c564f653f90\") " pod="openstack/neutron-981f-account-create-68lnd" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.670073 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzl4\" (UniqueName: \"kubernetes.io/projected/ccd42fc2-3f65-4d1f-a6bc-4c564f653f90-kube-api-access-4bzl4\") pod \"neutron-981f-account-create-68lnd\" (UID: \"ccd42fc2-3f65-4d1f-a6bc-4c564f653f90\") " pod="openstack/neutron-981f-account-create-68lnd" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.733989 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bzl4\" (UniqueName: \"kubernetes.io/projected/ccd42fc2-3f65-4d1f-a6bc-4c564f653f90-kube-api-access-4bzl4\") pod \"neutron-981f-account-create-68lnd\" (UID: \"ccd42fc2-3f65-4d1f-a6bc-4c564f653f90\") " pod="openstack/neutron-981f-account-create-68lnd" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.847728 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-981f-account-create-68lnd" Sep 30 19:48:42 crc kubenswrapper[4553]: I0930 19:48:42.873297 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"7ad8b1676f4e7b46d908a20dd9251429124d48b8ac99e4d5ac8bd7b31a54d547"} Sep 30 19:48:43 crc kubenswrapper[4553]: I0930 19:48:43.243695 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e6e0-account-create-g2wlx"] Sep 30 19:48:43 crc kubenswrapper[4553]: I0930 19:48:43.519745 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-981f-account-create-68lnd"] Sep 30 19:48:43 crc kubenswrapper[4553]: I0930 19:48:43.887106 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"44c37a51cfcbdf4fa8e221ad2376881a6f70096b297fce8ddba5a54e7eab0a67"} Sep 30 19:48:43 crc kubenswrapper[4553]: I0930 19:48:43.888955 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e6e0-account-create-g2wlx" event={"ID":"b2f1d2b7-12bf-4d45-b80e-712e015a61e5","Type":"ContainerStarted","Data":"46c9754aed86dab703e791c801f15b4797165dc388f19abeb54b3cec5780cd31"} Sep 30 19:48:43 crc kubenswrapper[4553]: I0930 19:48:43.888997 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e6e0-account-create-g2wlx" event={"ID":"b2f1d2b7-12bf-4d45-b80e-712e015a61e5","Type":"ContainerStarted","Data":"2e733f74092f313ef704879c4ec98f52e74dd743c03ffd28d8f7bcf888b6956f"} Sep 30 19:48:43 crc kubenswrapper[4553]: I0930 19:48:43.889953 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-981f-account-create-68lnd" event={"ID":"ccd42fc2-3f65-4d1f-a6bc-4c564f653f90","Type":"ContainerStarted","Data":"495a9682c6101b5b5b1f3b7905f8be355664c359b19b3f899b822f4a9ec419b5"} Sep 30 19:48:47 crc 
kubenswrapper[4553]: I0930 19:48:47.927404 4553 generic.go:334] "Generic (PLEG): container finished" podID="b2f1d2b7-12bf-4d45-b80e-712e015a61e5" containerID="46c9754aed86dab703e791c801f15b4797165dc388f19abeb54b3cec5780cd31" exitCode=0 Sep 30 19:48:47 crc kubenswrapper[4553]: I0930 19:48:47.927497 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e6e0-account-create-g2wlx" event={"ID":"b2f1d2b7-12bf-4d45-b80e-712e015a61e5","Type":"ContainerDied","Data":"46c9754aed86dab703e791c801f15b4797165dc388f19abeb54b3cec5780cd31"} Sep 30 19:48:50 crc kubenswrapper[4553]: I0930 19:48:50.972314 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"3a745085a9c79dcb7779400a509261020e997b232e1c689916556a33a0e1f7b7"} Sep 30 19:48:50 crc kubenswrapper[4553]: I0930 19:48:50.972879 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"0758951615dae4a7cd736ab8093e8ff930bce52910685c398d1e5c8f94103d45"} Sep 30 19:48:50 crc kubenswrapper[4553]: I0930 19:48:50.979070 4553 generic.go:334] "Generic (PLEG): container finished" podID="ccd42fc2-3f65-4d1f-a6bc-4c564f653f90" containerID="501b490f06d27e303698db93df1d2d24e55b73d4d54b9e859a97d85996b56eec" exitCode=0 Sep 30 19:48:50 crc kubenswrapper[4553]: I0930 19:48:50.979152 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-981f-account-create-68lnd" event={"ID":"ccd42fc2-3f65-4d1f-a6bc-4c564f653f90","Type":"ContainerDied","Data":"501b490f06d27e303698db93df1d2d24e55b73d4d54b9e859a97d85996b56eec"} Sep 30 19:48:50 crc kubenswrapper[4553]: I0930 19:48:50.982930 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hsmcl" 
event={"ID":"3f158e70-9924-417e-b100-983f574bef9a","Type":"ContainerStarted","Data":"e37890e7b19c63a12e765509ea26c067a0fbd538f20814008b4129de1e200df9"} Sep 30 19:48:51 crc kubenswrapper[4553]: I0930 19:48:51.051020 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-hsmcl" podStartSLOduration=1.9190284690000001 podStartE2EDuration="18.05099781s" podCreationTimestamp="2025-09-30 19:48:33 +0000 UTC" firstStartedPulling="2025-09-30 19:48:34.278151131 +0000 UTC m=+967.477653251" lastFinishedPulling="2025-09-30 19:48:50.410120452 +0000 UTC m=+983.609622592" observedRunningTime="2025-09-30 19:48:51.044759881 +0000 UTC m=+984.244262031" watchObservedRunningTime="2025-09-30 19:48:51.05099781 +0000 UTC m=+984.250499940" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.000750 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"0af05a35-cd0b-4875-b263-c8c62ebaa2cc","Type":"ContainerStarted","Data":"0aca11d96f830df57e5cc6142d6fe2ce6dda8a184e7fe8c2f4bc774685b8367f"} Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.109186 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=50.534348831 podStartE2EDuration="58.10916157s" podCreationTimestamp="2025-09-30 19:47:54 +0000 UTC" firstStartedPulling="2025-09-30 19:48:28.05881216 +0000 UTC m=+961.258314290" lastFinishedPulling="2025-09-30 19:48:35.633624899 +0000 UTC m=+968.833127029" observedRunningTime="2025-09-30 19:48:52.093523998 +0000 UTC m=+985.293026118" watchObservedRunningTime="2025-09-30 19:48:52.10916157 +0000 UTC m=+985.308663720" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.222225 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f77f-account-create-zqps4"] Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.223794 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f77f-account-create-zqps4" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.231339 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.246072 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f77f-account-create-zqps4"] Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.275761 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh856\" (UniqueName: \"kubernetes.io/projected/ecc06475-3808-4387-8427-c82f3f77ba73-kube-api-access-fh856\") pod \"cinder-f77f-account-create-zqps4\" (UID: \"ecc06475-3808-4387-8427-c82f3f77ba73\") " pod="openstack/cinder-f77f-account-create-zqps4" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.378811 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh856\" (UniqueName: \"kubernetes.io/projected/ecc06475-3808-4387-8427-c82f3f77ba73-kube-api-access-fh856\") pod \"cinder-f77f-account-create-zqps4\" (UID: \"ecc06475-3808-4387-8427-c82f3f77ba73\") " pod="openstack/cinder-f77f-account-create-zqps4" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.412872 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh856\" (UniqueName: \"kubernetes.io/projected/ecc06475-3808-4387-8427-c82f3f77ba73-kube-api-access-fh856\") pod \"cinder-f77f-account-create-zqps4\" (UID: \"ecc06475-3808-4387-8427-c82f3f77ba73\") " pod="openstack/cinder-f77f-account-create-zqps4" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.453418 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tcn9h"] Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.454797 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.457099 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.516198 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e6e0-account-create-g2wlx" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.523438 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tcn9h"] Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.523620 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-981f-account-create-68lnd" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.568158 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f77f-account-create-zqps4" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.582320 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bzl4\" (UniqueName: \"kubernetes.io/projected/ccd42fc2-3f65-4d1f-a6bc-4c564f653f90-kube-api-access-4bzl4\") pod \"ccd42fc2-3f65-4d1f-a6bc-4c564f653f90\" (UID: \"ccd42fc2-3f65-4d1f-a6bc-4c564f653f90\") " Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.582690 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vswgg\" (UniqueName: \"kubernetes.io/projected/b2f1d2b7-12bf-4d45-b80e-712e015a61e5-kube-api-access-vswgg\") pod \"b2f1d2b7-12bf-4d45-b80e-712e015a61e5\" (UID: \"b2f1d2b7-12bf-4d45-b80e-712e015a61e5\") " Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.582884 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.582957 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-dns-svc\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.582979 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xg4p\" (UniqueName: \"kubernetes.io/projected/5aef0df7-8283-429d-a90f-756a236e04c2-kube-api-access-8xg4p\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.583009 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-config\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.583052 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.583121 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.591844 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2f1d2b7-12bf-4d45-b80e-712e015a61e5-kube-api-access-vswgg" (OuterVolumeSpecName: "kube-api-access-vswgg") pod "b2f1d2b7-12bf-4d45-b80e-712e015a61e5" (UID: "b2f1d2b7-12bf-4d45-b80e-712e015a61e5"). InnerVolumeSpecName "kube-api-access-vswgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.602228 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccd42fc2-3f65-4d1f-a6bc-4c564f653f90-kube-api-access-4bzl4" (OuterVolumeSpecName: "kube-api-access-4bzl4") pod "ccd42fc2-3f65-4d1f-a6bc-4c564f653f90" (UID: "ccd42fc2-3f65-4d1f-a6bc-4c564f653f90"). InnerVolumeSpecName "kube-api-access-4bzl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.690755 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.690897 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.690950 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.691001 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-dns-svc\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.691541 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xg4p\" (UniqueName: \"kubernetes.io/projected/5aef0df7-8283-429d-a90f-756a236e04c2-kube-api-access-8xg4p\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " 
pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.691649 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-config\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.691832 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bzl4\" (UniqueName: \"kubernetes.io/projected/ccd42fc2-3f65-4d1f-a6bc-4c564f653f90-kube-api-access-4bzl4\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.691848 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vswgg\" (UniqueName: \"kubernetes.io/projected/b2f1d2b7-12bf-4d45-b80e-712e015a61e5-kube-api-access-vswgg\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.692446 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.692708 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.692730 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-config\") pod 
\"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.692975 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.693837 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-dns-svc\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.708930 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xg4p\" (UniqueName: \"kubernetes.io/projected/5aef0df7-8283-429d-a90f-756a236e04c2-kube-api-access-8xg4p\") pod \"dnsmasq-dns-764c5664d7-tcn9h\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:52 crc kubenswrapper[4553]: I0930 19:48:52.837271 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:53 crc kubenswrapper[4553]: I0930 19:48:53.007759 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-981f-account-create-68lnd" event={"ID":"ccd42fc2-3f65-4d1f-a6bc-4c564f653f90","Type":"ContainerDied","Data":"495a9682c6101b5b5b1f3b7905f8be355664c359b19b3f899b822f4a9ec419b5"} Sep 30 19:48:53 crc kubenswrapper[4553]: I0930 19:48:53.007805 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="495a9682c6101b5b5b1f3b7905f8be355664c359b19b3f899b822f4a9ec419b5" Sep 30 19:48:53 crc kubenswrapper[4553]: I0930 19:48:53.007771 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-981f-account-create-68lnd" Sep 30 19:48:53 crc kubenswrapper[4553]: I0930 19:48:53.008946 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e6e0-account-create-g2wlx" Sep 30 19:48:53 crc kubenswrapper[4553]: I0930 19:48:53.008958 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e6e0-account-create-g2wlx" event={"ID":"b2f1d2b7-12bf-4d45-b80e-712e015a61e5","Type":"ContainerDied","Data":"2e733f74092f313ef704879c4ec98f52e74dd743c03ffd28d8f7bcf888b6956f"} Sep 30 19:48:53 crc kubenswrapper[4553]: I0930 19:48:53.009016 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e733f74092f313ef704879c4ec98f52e74dd743c03ffd28d8f7bcf888b6956f" Sep 30 19:48:53 crc kubenswrapper[4553]: I0930 19:48:53.042904 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f77f-account-create-zqps4"] Sep 30 19:48:53 crc kubenswrapper[4553]: W0930 19:48:53.049685 4553 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecc06475_3808_4387_8427_c82f3f77ba73.slice/crio-8e1c3c7a87bb4b5b7cc3304d438291cb17f3e97f2b4245e784cd79970b55ab05 WatchSource:0}: Error finding container 8e1c3c7a87bb4b5b7cc3304d438291cb17f3e97f2b4245e784cd79970b55ab05: Status 404 returned error can't find the container with id 8e1c3c7a87bb4b5b7cc3304d438291cb17f3e97f2b4245e784cd79970b55ab05 Sep 30 19:48:54 crc kubenswrapper[4553]: I0930 19:48:54.021180 4553 generic.go:334] "Generic (PLEG): container finished" podID="ecc06475-3808-4387-8427-c82f3f77ba73" containerID="7afad2e950debbaf9e8d1794c09604135264a240fed5f9dbf01384cea93133a5" exitCode=0 Sep 30 19:48:54 crc kubenswrapper[4553]: I0930 19:48:54.021488 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f77f-account-create-zqps4" event={"ID":"ecc06475-3808-4387-8427-c82f3f77ba73","Type":"ContainerDied","Data":"7afad2e950debbaf9e8d1794c09604135264a240fed5f9dbf01384cea93133a5"} Sep 30 19:48:54 crc kubenswrapper[4553]: I0930 19:48:54.021516 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f77f-account-create-zqps4" event={"ID":"ecc06475-3808-4387-8427-c82f3f77ba73","Type":"ContainerStarted","Data":"8e1c3c7a87bb4b5b7cc3304d438291cb17f3e97f2b4245e784cd79970b55ab05"} Sep 30 19:48:54 crc kubenswrapper[4553]: W0930 19:48:54.138741 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5aef0df7_8283_429d_a90f_756a236e04c2.slice/crio-9b6ec96294ec8029cf708468372a7c49942d041bdd942a5550c3993d34965f1e WatchSource:0}: Error finding container 9b6ec96294ec8029cf708468372a7c49942d041bdd942a5550c3993d34965f1e: Status 404 returned error can't find the container with id 9b6ec96294ec8029cf708468372a7c49942d041bdd942a5550c3993d34965f1e Sep 30 19:48:54 crc kubenswrapper[4553]: I0930 19:48:54.139164 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-764c5664d7-tcn9h"] Sep 30 19:48:55 crc kubenswrapper[4553]: I0930 19:48:55.030101 4553 generic.go:334] "Generic (PLEG): container finished" podID="5aef0df7-8283-429d-a90f-756a236e04c2" containerID="328366c9eace34d1fdfa6494fc2fe530c5de569d394341b79e464a332347841d" exitCode=0 Sep 30 19:48:55 crc kubenswrapper[4553]: I0930 19:48:55.030205 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" event={"ID":"5aef0df7-8283-429d-a90f-756a236e04c2","Type":"ContainerDied","Data":"328366c9eace34d1fdfa6494fc2fe530c5de569d394341b79e464a332347841d"} Sep 30 19:48:55 crc kubenswrapper[4553]: I0930 19:48:55.030390 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" event={"ID":"5aef0df7-8283-429d-a90f-756a236e04c2","Type":"ContainerStarted","Data":"9b6ec96294ec8029cf708468372a7c49942d041bdd942a5550c3993d34965f1e"} Sep 30 19:48:55 crc kubenswrapper[4553]: I0930 19:48:55.350830 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f77f-account-create-zqps4" Sep 30 19:48:55 crc kubenswrapper[4553]: I0930 19:48:55.454715 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh856\" (UniqueName: \"kubernetes.io/projected/ecc06475-3808-4387-8427-c82f3f77ba73-kube-api-access-fh856\") pod \"ecc06475-3808-4387-8427-c82f3f77ba73\" (UID: \"ecc06475-3808-4387-8427-c82f3f77ba73\") " Sep 30 19:48:55 crc kubenswrapper[4553]: I0930 19:48:55.467431 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc06475-3808-4387-8427-c82f3f77ba73-kube-api-access-fh856" (OuterVolumeSpecName: "kube-api-access-fh856") pod "ecc06475-3808-4387-8427-c82f3f77ba73" (UID: "ecc06475-3808-4387-8427-c82f3f77ba73"). InnerVolumeSpecName "kube-api-access-fh856". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:55 crc kubenswrapper[4553]: I0930 19:48:55.557919 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh856\" (UniqueName: \"kubernetes.io/projected/ecc06475-3808-4387-8427-c82f3f77ba73-kube-api-access-fh856\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:56 crc kubenswrapper[4553]: I0930 19:48:56.056871 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f77f-account-create-zqps4" event={"ID":"ecc06475-3808-4387-8427-c82f3f77ba73","Type":"ContainerDied","Data":"8e1c3c7a87bb4b5b7cc3304d438291cb17f3e97f2b4245e784cd79970b55ab05"} Sep 30 19:48:56 crc kubenswrapper[4553]: I0930 19:48:56.056913 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e1c3c7a87bb4b5b7cc3304d438291cb17f3e97f2b4245e784cd79970b55ab05" Sep 30 19:48:56 crc kubenswrapper[4553]: I0930 19:48:56.058545 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f77f-account-create-zqps4" Sep 30 19:48:56 crc kubenswrapper[4553]: I0930 19:48:56.058983 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" event={"ID":"5aef0df7-8283-429d-a90f-756a236e04c2","Type":"ContainerStarted","Data":"6bb2dfce4f24deda5ffb1bdccf1e60fd83dcb0a3ef1e3ae2918d4162f09d473b"} Sep 30 19:48:56 crc kubenswrapper[4553]: I0930 19:48:56.059854 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:48:56 crc kubenswrapper[4553]: I0930 19:48:56.061724 4553 generic.go:334] "Generic (PLEG): container finished" podID="3f158e70-9924-417e-b100-983f574bef9a" containerID="e37890e7b19c63a12e765509ea26c067a0fbd538f20814008b4129de1e200df9" exitCode=0 Sep 30 19:48:56 crc kubenswrapper[4553]: I0930 19:48:56.061750 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hsmcl" 
event={"ID":"3f158e70-9924-417e-b100-983f574bef9a","Type":"ContainerDied","Data":"e37890e7b19c63a12e765509ea26c067a0fbd538f20814008b4129de1e200df9"} Sep 30 19:48:56 crc kubenswrapper[4553]: I0930 19:48:56.098207 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" podStartSLOduration=4.098185317 podStartE2EDuration="4.098185317s" podCreationTimestamp="2025-09-30 19:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:48:56.082952816 +0000 UTC m=+989.282454946" watchObservedRunningTime="2025-09-30 19:48:56.098185317 +0000 UTC m=+989.297687447" Sep 30 19:48:57 crc kubenswrapper[4553]: I0930 19:48:57.432528 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-hsmcl" Sep 30 19:48:57 crc kubenswrapper[4553]: I0930 19:48:57.493924 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f158e70-9924-417e-b100-983f574bef9a-combined-ca-bundle\") pod \"3f158e70-9924-417e-b100-983f574bef9a\" (UID: \"3f158e70-9924-417e-b100-983f574bef9a\") " Sep 30 19:48:57 crc kubenswrapper[4553]: I0930 19:48:57.494238 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8ffz\" (UniqueName: \"kubernetes.io/projected/3f158e70-9924-417e-b100-983f574bef9a-kube-api-access-j8ffz\") pod \"3f158e70-9924-417e-b100-983f574bef9a\" (UID: \"3f158e70-9924-417e-b100-983f574bef9a\") " Sep 30 19:48:57 crc kubenswrapper[4553]: I0930 19:48:57.494346 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f158e70-9924-417e-b100-983f574bef9a-config-data\") pod \"3f158e70-9924-417e-b100-983f574bef9a\" (UID: \"3f158e70-9924-417e-b100-983f574bef9a\") " Sep 30 19:48:57 crc 
kubenswrapper[4553]: I0930 19:48:57.498193 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f158e70-9924-417e-b100-983f574bef9a-kube-api-access-j8ffz" (OuterVolumeSpecName: "kube-api-access-j8ffz") pod "3f158e70-9924-417e-b100-983f574bef9a" (UID: "3f158e70-9924-417e-b100-983f574bef9a"). InnerVolumeSpecName "kube-api-access-j8ffz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:48:57 crc kubenswrapper[4553]: I0930 19:48:57.520548 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f158e70-9924-417e-b100-983f574bef9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f158e70-9924-417e-b100-983f574bef9a" (UID: "3f158e70-9924-417e-b100-983f574bef9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:48:57 crc kubenswrapper[4553]: I0930 19:48:57.561837 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f158e70-9924-417e-b100-983f574bef9a-config-data" (OuterVolumeSpecName: "config-data") pod "3f158e70-9924-417e-b100-983f574bef9a" (UID: "3f158e70-9924-417e-b100-983f574bef9a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:48:57 crc kubenswrapper[4553]: I0930 19:48:57.596062 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f158e70-9924-417e-b100-983f574bef9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:57 crc kubenswrapper[4553]: I0930 19:48:57.596338 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8ffz\" (UniqueName: \"kubernetes.io/projected/3f158e70-9924-417e-b100-983f574bef9a-kube-api-access-j8ffz\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:57 crc kubenswrapper[4553]: I0930 19:48:57.596348 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f158e70-9924-417e-b100-983f574bef9a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.106145 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-hsmcl" event={"ID":"3f158e70-9924-417e-b100-983f574bef9a","Type":"ContainerDied","Data":"fd9194e0fd2c213c949e3aaee3601d6818243eb76d822c37f2f3e44fcce23d12"} Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.106376 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd9194e0fd2c213c949e3aaee3601d6818243eb76d822c37f2f3e44fcce23d12" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.107294 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-hsmcl" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.380210 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tcn9h"] Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.380415 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" podUID="5aef0df7-8283-429d-a90f-756a236e04c2" containerName="dnsmasq-dns" containerID="cri-o://6bb2dfce4f24deda5ffb1bdccf1e60fd83dcb0a3ef1e3ae2918d4162f09d473b" gracePeriod=10 Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.455172 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-df4gj"] Sep 30 19:48:58 crc kubenswrapper[4553]: E0930 19:48:58.455511 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc06475-3808-4387-8427-c82f3f77ba73" containerName="mariadb-account-create" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.455523 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc06475-3808-4387-8427-c82f3f77ba73" containerName="mariadb-account-create" Sep 30 19:48:58 crc kubenswrapper[4553]: E0930 19:48:58.455532 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd42fc2-3f65-4d1f-a6bc-4c564f653f90" containerName="mariadb-account-create" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.455538 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd42fc2-3f65-4d1f-a6bc-4c564f653f90" containerName="mariadb-account-create" Sep 30 19:48:58 crc kubenswrapper[4553]: E0930 19:48:58.455552 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f158e70-9924-417e-b100-983f574bef9a" containerName="keystone-db-sync" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.455559 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f158e70-9924-417e-b100-983f574bef9a" containerName="keystone-db-sync" Sep 30 19:48:58 crc 
kubenswrapper[4553]: E0930 19:48:58.455567 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2f1d2b7-12bf-4d45-b80e-712e015a61e5" containerName="mariadb-account-create" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.455572 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2f1d2b7-12bf-4d45-b80e-712e015a61e5" containerName="mariadb-account-create" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.455727 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2f1d2b7-12bf-4d45-b80e-712e015a61e5" containerName="mariadb-account-create" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.455742 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f158e70-9924-417e-b100-983f574bef9a" containerName="keystone-db-sync" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.455756 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd42fc2-3f65-4d1f-a6bc-4c564f653f90" containerName="mariadb-account-create" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.455767 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc06475-3808-4387-8427-c82f3f77ba73" containerName="mariadb-account-create" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.456285 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.490974 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.491875 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.492746 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hswpl" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.493097 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.522852 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-fernet-keys\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.522998 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwst4\" (UniqueName: \"kubernetes.io/projected/5eadc17c-def7-44ac-bafe-23adea8e696a-kube-api-access-qwst4\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.523050 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-credential-keys\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.523070 4553 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-combined-ca-bundle\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.523260 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-scripts\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.523307 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-config-data\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.554121 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-4lhbd"] Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.555633 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.575335 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-df4gj"] Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.625687 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-scripts\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.625753 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-config-data\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.625793 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-fernet-keys\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.625831 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.625861 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.625908 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.625931 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zss5d\" (UniqueName: \"kubernetes.io/projected/5b0ff91e-d7de-4edf-9206-197dea687f2f-kube-api-access-zss5d\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.625962 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwst4\" (UniqueName: \"kubernetes.io/projected/5eadc17c-def7-44ac-bafe-23adea8e696a-kube-api-access-qwst4\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.625985 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-config\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.626014 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-credential-keys\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.626048 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-combined-ca-bundle\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.626097 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-dns-svc\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.630470 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-fernet-keys\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.635494 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-scripts\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.636291 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-credential-keys\") pod \"keystone-bootstrap-df4gj\" 
(UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.644256 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-combined-ca-bundle\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.652604 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-config-data\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.700117 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-4lhbd"] Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.737883 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.737935 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.737969 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.737989 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zss5d\" (UniqueName: \"kubernetes.io/projected/5b0ff91e-d7de-4edf-9206-197dea687f2f-kube-api-access-zss5d\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.738014 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-config\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.738076 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-dns-svc\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.738880 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-dns-svc\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.739390 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.739872 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.740475 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-config\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.740495 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.802742 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zss5d\" (UniqueName: \"kubernetes.io/projected/5b0ff91e-d7de-4edf-9206-197dea687f2f-kube-api-access-zss5d\") pod \"dnsmasq-dns-5959f8865f-4lhbd\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.827614 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwst4\" (UniqueName: \"kubernetes.io/projected/5eadc17c-def7-44ac-bafe-23adea8e696a-kube-api-access-qwst4\") pod \"keystone-bootstrap-df4gj\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " 
pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:58 crc kubenswrapper[4553]: I0930 19:48:58.882030 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.021648 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f698cb877-vqsz6"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.023164 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.071408 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.071743 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-f7rgm"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.082856 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f698cb877-vqsz6"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.082957 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-f7rgm" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.087899 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.087944 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.088150 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sl72r" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.088301 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.093924 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-srk6z" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.094109 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.094583 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.115107 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-f7rgm"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.120901 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-prf67"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.130572 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.149481 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-prf67"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.160235 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.160379 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.160314 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tvsgj" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.162226 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg5sr\" (UniqueName: \"kubernetes.io/projected/d1bf2fc0-8737-4258-9bf8-1978001043f9-kube-api-access-dg5sr\") pod \"neutron-db-sync-f7rgm\" (UID: \"d1bf2fc0-8737-4258-9bf8-1978001043f9\") " pod="openstack/neutron-db-sync-f7rgm" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.162268 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/95fc6b11-e9bf-4886-9036-8276f127b8bf-horizon-secret-key\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.162290 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-db-sync-config-data\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.162313 4553 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95fc6b11-e9bf-4886-9036-8276f127b8bf-logs\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.162346 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-scripts\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.162363 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1bf2fc0-8737-4258-9bf8-1978001043f9-combined-ca-bundle\") pod \"neutron-db-sync-f7rgm\" (UID: \"d1bf2fc0-8737-4258-9bf8-1978001043f9\") " pod="openstack/neutron-db-sync-f7rgm" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.162761 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04f1abd5-5975-4038-98b3-4b6ff0e858f7-etc-machine-id\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.162887 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95fc6b11-e9bf-4886-9036-8276f127b8bf-scripts\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.162967 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-config-data\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.163097 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdh9d\" (UniqueName: \"kubernetes.io/projected/04f1abd5-5975-4038-98b3-4b6ff0e858f7-kube-api-access-zdh9d\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.163180 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkq4h\" (UniqueName: \"kubernetes.io/projected/95fc6b11-e9bf-4886-9036-8276f127b8bf-kube-api-access-gkq4h\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.163252 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-combined-ca-bundle\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.163271 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1bf2fc0-8737-4258-9bf8-1978001043f9-config\") pod \"neutron-db-sync-f7rgm\" (UID: \"d1bf2fc0-8737-4258-9bf8-1978001043f9\") " pod="openstack/neutron-db-sync-f7rgm" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.163323 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95fc6b11-e9bf-4886-9036-8276f127b8bf-config-data\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.198765 4553 generic.go:334] "Generic (PLEG): container finished" podID="5aef0df7-8283-429d-a90f-756a236e04c2" containerID="6bb2dfce4f24deda5ffb1bdccf1e60fd83dcb0a3ef1e3ae2918d4162f09d473b" exitCode=0 Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.198808 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" event={"ID":"5aef0df7-8283-429d-a90f-756a236e04c2","Type":"ContainerDied","Data":"6bb2dfce4f24deda5ffb1bdccf1e60fd83dcb0a3ef1e3ae2918d4162f09d473b"} Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264495 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/95fc6b11-e9bf-4886-9036-8276f127b8bf-horizon-secret-key\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264534 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-db-sync-config-data\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264561 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95fc6b11-e9bf-4886-9036-8276f127b8bf-logs\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 
19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264592 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-scripts\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264611 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1bf2fc0-8737-4258-9bf8-1978001043f9-combined-ca-bundle\") pod \"neutron-db-sync-f7rgm\" (UID: \"d1bf2fc0-8737-4258-9bf8-1978001043f9\") " pod="openstack/neutron-db-sync-f7rgm" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264635 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04f1abd5-5975-4038-98b3-4b6ff0e858f7-etc-machine-id\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264655 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95fc6b11-e9bf-4886-9036-8276f127b8bf-scripts\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264674 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-config-data\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264720 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zdh9d\" (UniqueName: \"kubernetes.io/projected/04f1abd5-5975-4038-98b3-4b6ff0e858f7-kube-api-access-zdh9d\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264743 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkq4h\" (UniqueName: \"kubernetes.io/projected/95fc6b11-e9bf-4886-9036-8276f127b8bf-kube-api-access-gkq4h\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264765 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-combined-ca-bundle\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264782 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1bf2fc0-8737-4258-9bf8-1978001043f9-config\") pod \"neutron-db-sync-f7rgm\" (UID: \"d1bf2fc0-8737-4258-9bf8-1978001043f9\") " pod="openstack/neutron-db-sync-f7rgm" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264796 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95fc6b11-e9bf-4886-9036-8276f127b8bf-config-data\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.264823 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg5sr\" (UniqueName: 
\"kubernetes.io/projected/d1bf2fc0-8737-4258-9bf8-1978001043f9-kube-api-access-dg5sr\") pod \"neutron-db-sync-f7rgm\" (UID: \"d1bf2fc0-8737-4258-9bf8-1978001043f9\") " pod="openstack/neutron-db-sync-f7rgm" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.267678 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95fc6b11-e9bf-4886-9036-8276f127b8bf-scripts\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.274922 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/95fc6b11-e9bf-4886-9036-8276f127b8bf-horizon-secret-key\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.274934 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95fc6b11-e9bf-4886-9036-8276f127b8bf-logs\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.275808 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95fc6b11-e9bf-4886-9036-8276f127b8bf-config-data\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.275845 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1bf2fc0-8737-4258-9bf8-1978001043f9-config\") pod \"neutron-db-sync-f7rgm\" (UID: \"d1bf2fc0-8737-4258-9bf8-1978001043f9\") " 
pod="openstack/neutron-db-sync-f7rgm" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.275855 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.276761 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04f1abd5-5975-4038-98b3-4b6ff0e858f7-etc-machine-id\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.278608 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.279931 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-db-sync-config-data\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.286769 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-combined-ca-bundle\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.286932 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-config-data\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.287116 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d1bf2fc0-8737-4258-9bf8-1978001043f9-combined-ca-bundle\") pod \"neutron-db-sync-f7rgm\" (UID: \"d1bf2fc0-8737-4258-9bf8-1978001043f9\") " pod="openstack/neutron-db-sync-f7rgm" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.287506 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-scripts\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.290718 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.300367 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.300877 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.308535 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zsslz"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.309571 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.311660 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdh9d\" (UniqueName: \"kubernetes.io/projected/04f1abd5-5975-4038-98b3-4b6ff0e858f7-kube-api-access-zdh9d\") pod \"cinder-db-sync-prf67\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") " pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.334971 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg5sr\" (UniqueName: \"kubernetes.io/projected/d1bf2fc0-8737-4258-9bf8-1978001043f9-kube-api-access-dg5sr\") pod \"neutron-db-sync-f7rgm\" (UID: \"d1bf2fc0-8737-4258-9bf8-1978001043f9\") " pod="openstack/neutron-db-sync-f7rgm" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.335004 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkq4h\" (UniqueName: \"kubernetes.io/projected/95fc6b11-e9bf-4886-9036-8276f127b8bf-kube-api-access-gkq4h\") pod \"horizon-f698cb877-vqsz6\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.335408 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.335590 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qq8m2" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.335807 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.370105 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.377091 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-4lhbd"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.416405 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-pcbqs"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.417873 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.431153 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zsslz"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.441596 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-f7rgm" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.461265 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-pcbqs"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.479267 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-combined-ca-bundle\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.479328 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906fbd6e-e72f-428f-b182-f583c009fc93-run-httpd\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.479449 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.479490 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzg9v\" (UniqueName: \"kubernetes.io/projected/2c64bfc6-5406-43fe-9f4d-744b7634d300-kube-api-access-fzg9v\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.479602 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.479636 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7gfm\" (UniqueName: \"kubernetes.io/projected/906fbd6e-e72f-428f-b182-f583c009fc93-kube-api-access-s7gfm\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.479678 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.479780 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-config-data\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.479802 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-config-data\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.479898 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt5fx\" (UniqueName: \"kubernetes.io/projected/08c9fecb-7dc9-4aed-b134-98995f1cf280-kube-api-access-lt5fx\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.479941 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-scripts\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.487912 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-prf67" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.479959 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906fbd6e-e72f-428f-b182-f583c009fc93-log-httpd\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.598474 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-scripts\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.598510 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.598604 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c9fecb-7dc9-4aed-b134-98995f1cf280-logs\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.598687 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 
19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.598706 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-config\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.598758 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.630819 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-k52t8"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.650669 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-k52t8" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.657357 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.662277 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fb95k" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.706076 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-combined-ca-bundle\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.706153 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906fbd6e-e72f-428f-b182-f583c009fc93-run-httpd\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724308 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724391 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9958ea9-408e-4b14-8b23-dd1662654cd1-combined-ca-bundle\") pod \"barbican-db-sync-k52t8\" (UID: \"c9958ea9-408e-4b14-8b23-dd1662654cd1\") " pod="openstack/barbican-db-sync-k52t8" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724439 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fzg9v\" (UniqueName: \"kubernetes.io/projected/2c64bfc6-5406-43fe-9f4d-744b7634d300-kube-api-access-fzg9v\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724463 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724508 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7gfm\" (UniqueName: \"kubernetes.io/projected/906fbd6e-e72f-428f-b182-f583c009fc93-kube-api-access-s7gfm\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724535 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9958ea9-408e-4b14-8b23-dd1662654cd1-db-sync-config-data\") pod \"barbican-db-sync-k52t8\" (UID: \"c9958ea9-408e-4b14-8b23-dd1662654cd1\") " pod="openstack/barbican-db-sync-k52t8" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724610 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724638 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-config-data\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724653 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-config-data\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724675 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5fx\" (UniqueName: \"kubernetes.io/projected/08c9fecb-7dc9-4aed-b134-98995f1cf280-kube-api-access-lt5fx\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724786 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-scripts\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724800 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906fbd6e-e72f-428f-b182-f583c009fc93-log-httpd\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724826 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-scripts\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" 
Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724845 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724891 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c9fecb-7dc9-4aed-b134-98995f1cf280-logs\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724919 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cplpz\" (UniqueName: \"kubernetes.io/projected/c9958ea9-408e-4b14-8b23-dd1662654cd1-kube-api-access-cplpz\") pod \"barbican-db-sync-k52t8\" (UID: \"c9958ea9-408e-4b14-8b23-dd1662654cd1\") " pod="openstack/barbican-db-sync-k52t8" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724949 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-config\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724967 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.724991 4553 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.725867 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.730565 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-k52t8"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.731181 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.731188 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906fbd6e-e72f-428f-b182-f583c009fc93-run-httpd\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.735378 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-scripts\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.735575 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906fbd6e-e72f-428f-b182-f583c009fc93-log-httpd\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.735972 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-config-data\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.738006 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-config\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.739024 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.747352 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-config-data\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.748407 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-ovsdbserver-sb\") pod 
\"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.767226 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-combined-ca-bundle\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.768412 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.768576 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-bd6c97f9c-7td27"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.772109 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7gfm\" (UniqueName: \"kubernetes.io/projected/906fbd6e-e72f-428f-b182-f583c009fc93-kube-api-access-s7gfm\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.773184 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-scripts\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.773394 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c9fecb-7dc9-4aed-b134-98995f1cf280-logs\") pod \"placement-db-sync-zsslz\" (UID: 
\"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.773803 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.774865 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzg9v\" (UniqueName: \"kubernetes.io/projected/2c64bfc6-5406-43fe-9f4d-744b7634d300-kube-api-access-fzg9v\") pod \"dnsmasq-dns-58dd9ff6bc-pcbqs\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.775281 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt5fx\" (UniqueName: \"kubernetes.io/projected/08c9fecb-7dc9-4aed-b134-98995f1cf280-kube-api-access-lt5fx\") pod \"placement-db-sync-zsslz\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") " pod="openstack/placement-db-sync-zsslz" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.775508 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bd6c97f9c-7td27"] Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.775945 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.798389 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.826939 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9958ea9-408e-4b14-8b23-dd1662654cd1-db-sync-config-data\") pod \"barbican-db-sync-k52t8\" (UID: \"c9958ea9-408e-4b14-8b23-dd1662654cd1\") " pod="openstack/barbican-db-sync-k52t8" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.826998 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0e3b11b-9596-49bc-bd3e-11b74b885361-config-data\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.827090 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw9ls\" (UniqueName: \"kubernetes.io/projected/a0e3b11b-9596-49bc-bd3e-11b74b885361-kube-api-access-gw9ls\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.827113 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0e3b11b-9596-49bc-bd3e-11b74b885361-scripts\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.827133 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cplpz\" (UniqueName: \"kubernetes.io/projected/c9958ea9-408e-4b14-8b23-dd1662654cd1-kube-api-access-cplpz\") pod \"barbican-db-sync-k52t8\" (UID: \"c9958ea9-408e-4b14-8b23-dd1662654cd1\") " 
pod="openstack/barbican-db-sync-k52t8" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.827168 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a0e3b11b-9596-49bc-bd3e-11b74b885361-horizon-secret-key\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.827197 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0e3b11b-9596-49bc-bd3e-11b74b885361-logs\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.827239 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9958ea9-408e-4b14-8b23-dd1662654cd1-combined-ca-bundle\") pod \"barbican-db-sync-k52t8\" (UID: \"c9958ea9-408e-4b14-8b23-dd1662654cd1\") " pod="openstack/barbican-db-sync-k52t8" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.833237 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9958ea9-408e-4b14-8b23-dd1662654cd1-combined-ca-bundle\") pod \"barbican-db-sync-k52t8\" (UID: \"c9958ea9-408e-4b14-8b23-dd1662654cd1\") " pod="openstack/barbican-db-sync-k52t8" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.840001 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9958ea9-408e-4b14-8b23-dd1662654cd1-db-sync-config-data\") pod \"barbican-db-sync-k52t8\" (UID: \"c9958ea9-408e-4b14-8b23-dd1662654cd1\") " pod="openstack/barbican-db-sync-k52t8" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 
19:48:59.852489 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cplpz\" (UniqueName: \"kubernetes.io/projected/c9958ea9-408e-4b14-8b23-dd1662654cd1-kube-api-access-cplpz\") pod \"barbican-db-sync-k52t8\" (UID: \"c9958ea9-408e-4b14-8b23-dd1662654cd1\") " pod="openstack/barbican-db-sync-k52t8" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.909078 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.928632 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0e3b11b-9596-49bc-bd3e-11b74b885361-config-data\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.928711 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw9ls\" (UniqueName: \"kubernetes.io/projected/a0e3b11b-9596-49bc-bd3e-11b74b885361-kube-api-access-gw9ls\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.928730 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0e3b11b-9596-49bc-bd3e-11b74b885361-scripts\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.928764 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a0e3b11b-9596-49bc-bd3e-11b74b885361-horizon-secret-key\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " 
pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.928789 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0e3b11b-9596-49bc-bd3e-11b74b885361-logs\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.930940 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0e3b11b-9596-49bc-bd3e-11b74b885361-logs\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.931668 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0e3b11b-9596-49bc-bd3e-11b74b885361-scripts\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.931997 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0e3b11b-9596-49bc-bd3e-11b74b885361-config-data\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.944339 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a0e3b11b-9596-49bc-bd3e-11b74b885361-horizon-secret-key\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.983169 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gw9ls\" (UniqueName: \"kubernetes.io/projected/a0e3b11b-9596-49bc-bd3e-11b74b885361-kube-api-access-gw9ls\") pod \"horizon-bd6c97f9c-7td27\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:48:59 crc kubenswrapper[4553]: I0930 19:48:59.986920 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zsslz" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.033585 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.080995 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-k52t8" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.099972 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.133357 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-ovsdbserver-nb\") pod \"5aef0df7-8283-429d-a90f-756a236e04c2\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.133457 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-config\") pod \"5aef0df7-8283-429d-a90f-756a236e04c2\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.133539 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-dns-svc\") pod \"5aef0df7-8283-429d-a90f-756a236e04c2\" (UID: 
\"5aef0df7-8283-429d-a90f-756a236e04c2\") " Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.133573 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-ovsdbserver-sb\") pod \"5aef0df7-8283-429d-a90f-756a236e04c2\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.133607 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xg4p\" (UniqueName: \"kubernetes.io/projected/5aef0df7-8283-429d-a90f-756a236e04c2-kube-api-access-8xg4p\") pod \"5aef0df7-8283-429d-a90f-756a236e04c2\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.133648 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-dns-swift-storage-0\") pod \"5aef0df7-8283-429d-a90f-756a236e04c2\" (UID: \"5aef0df7-8283-429d-a90f-756a236e04c2\") " Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.149568 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-df4gj"] Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.154609 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aef0df7-8283-429d-a90f-756a236e04c2-kube-api-access-8xg4p" (OuterVolumeSpecName: "kube-api-access-8xg4p") pod "5aef0df7-8283-429d-a90f-756a236e04c2" (UID: "5aef0df7-8283-429d-a90f-756a236e04c2"). InnerVolumeSpecName "kube-api-access-8xg4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.162631 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-4lhbd"] Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.242466 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xg4p\" (UniqueName: \"kubernetes.io/projected/5aef0df7-8283-429d-a90f-756a236e04c2-kube-api-access-8xg4p\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.283254 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" event={"ID":"5aef0df7-8283-429d-a90f-756a236e04c2","Type":"ContainerDied","Data":"9b6ec96294ec8029cf708468372a7c49942d041bdd942a5550c3993d34965f1e"} Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.283295 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-tcn9h" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.283327 4553 scope.go:117] "RemoveContainer" containerID="6bb2dfce4f24deda5ffb1bdccf1e60fd83dcb0a3ef1e3ae2918d4162f09d473b" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.302938 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5aef0df7-8283-429d-a90f-756a236e04c2" (UID: "5aef0df7-8283-429d-a90f-756a236e04c2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.318561 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5aef0df7-8283-429d-a90f-756a236e04c2" (UID: "5aef0df7-8283-429d-a90f-756a236e04c2"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.345200 4553 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.345225 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.345430 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5aef0df7-8283-429d-a90f-756a236e04c2" (UID: "5aef0df7-8283-429d-a90f-756a236e04c2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.377473 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-config" (OuterVolumeSpecName: "config") pod "5aef0df7-8283-429d-a90f-756a236e04c2" (UID: "5aef0df7-8283-429d-a90f-756a236e04c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.389068 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5aef0df7-8283-429d-a90f-756a236e04c2" (UID: "5aef0df7-8283-429d-a90f-756a236e04c2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.394831 4553 scope.go:117] "RemoveContainer" containerID="328366c9eace34d1fdfa6494fc2fe530c5de569d394341b79e464a332347841d" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.397494 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-f7rgm"] Sep 30 19:49:00 crc kubenswrapper[4553]: W0930 19:49:00.402533 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1bf2fc0_8737_4258_9bf8_1978001043f9.slice/crio-414def7d452087bf14dbebdc33668a21c43ba73d7e77ec8355146aeb3247bb57 WatchSource:0}: Error finding container 414def7d452087bf14dbebdc33668a21c43ba73d7e77ec8355146aeb3247bb57: Status 404 returned error can't find the container with id 414def7d452087bf14dbebdc33668a21c43ba73d7e77ec8355146aeb3247bb57 Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.449930 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.449956 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.449965 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aef0df7-8283-429d-a90f-756a236e04c2-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.623726 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f698cb877-vqsz6"] Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.652509 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-db-sync-prf67"] Sep 30 19:49:00 crc kubenswrapper[4553]: W0930 19:49:00.673134 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04f1abd5_5975_4038_98b3_4b6ff0e858f7.slice/crio-c849e2e448fb5bf4e0f6ae6bb9d6b09353372671d50d9ec4a697e6cedacca6e9 WatchSource:0}: Error finding container c849e2e448fb5bf4e0f6ae6bb9d6b09353372671d50d9ec4a697e6cedacca6e9: Status 404 returned error can't find the container with id c849e2e448fb5bf4e0f6ae6bb9d6b09353372671d50d9ec4a697e6cedacca6e9 Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.775084 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tcn9h"] Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.786204 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tcn9h"] Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.849691 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:49:00 crc kubenswrapper[4553]: I0930 19:49:00.864759 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-pcbqs"] Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.292452 4553 generic.go:334] "Generic (PLEG): container finished" podID="5b0ff91e-d7de-4edf-9206-197dea687f2f" containerID="8bfaa7951f07d2a2c8d8a36c51f5d4317170df4653f2a068460bed7bcc312599" exitCode=0 Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.292630 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" event={"ID":"5b0ff91e-d7de-4edf-9206-197dea687f2f","Type":"ContainerDied","Data":"8bfaa7951f07d2a2c8d8a36c51f5d4317170df4653f2a068460bed7bcc312599"} Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.292845 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" 
event={"ID":"5b0ff91e-d7de-4edf-9206-197dea687f2f","Type":"ContainerStarted","Data":"d4f6839aea2679e81e656229f131d0e6970e9787c0b5f22ecc8181091d1965d5"} Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.295830 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f7rgm" event={"ID":"d1bf2fc0-8737-4258-9bf8-1978001043f9","Type":"ContainerStarted","Data":"c35f71ed62ab9c4849e25d2f14da54779822bd9438fec58dbc0c4ac04c0373ed"} Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.295855 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f7rgm" event={"ID":"d1bf2fc0-8737-4258-9bf8-1978001043f9","Type":"ContainerStarted","Data":"414def7d452087bf14dbebdc33668a21c43ba73d7e77ec8355146aeb3247bb57"} Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.308107 4553 generic.go:334] "Generic (PLEG): container finished" podID="3f9a8e95-e61a-473d-a74f-cf7a6820ff97" containerID="8b86fa732ebe460ff0df6d1a947d9d597f10f8fc2044fc5328f14860e3a3852c" exitCode=0 Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.308181 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gw5ch" event={"ID":"3f9a8e95-e61a-473d-a74f-cf7a6820ff97","Type":"ContainerDied","Data":"8b86fa732ebe460ff0df6d1a947d9d597f10f8fc2044fc5328f14860e3a3852c"} Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.317020 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f698cb877-vqsz6" event={"ID":"95fc6b11-e9bf-4886-9036-8276f127b8bf","Type":"ContainerStarted","Data":"c924f01360851f3432d6ec2ee4323179d1a0a1bf888f88edd415e76f5291e0f0"} Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.320617 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zsslz"] Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.336258 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-prf67" 
event={"ID":"04f1abd5-5975-4038-98b3-4b6ff0e858f7","Type":"ContainerStarted","Data":"c849e2e448fb5bf4e0f6ae6bb9d6b09353372671d50d9ec4a697e6cedacca6e9"} Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.338079 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bd6c97f9c-7td27"] Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.367192 4553 generic.go:334] "Generic (PLEG): container finished" podID="2c64bfc6-5406-43fe-9f4d-744b7634d300" containerID="9d2960034063dcb84597c82212d614cf4291681107345ef366eb051a3bb6e093" exitCode=0 Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.367428 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" event={"ID":"2c64bfc6-5406-43fe-9f4d-744b7634d300","Type":"ContainerDied","Data":"9d2960034063dcb84597c82212d614cf4291681107345ef366eb051a3bb6e093"} Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.367455 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" event={"ID":"2c64bfc6-5406-43fe-9f4d-744b7634d300","Type":"ContainerStarted","Data":"7018b39a8138f0c5c38adf1096db41befb700c8ff7e779c818ca2082687082dd"} Sep 30 19:49:01 crc kubenswrapper[4553]: W0930 19:49:01.400554 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0e3b11b_9596_49bc_bd3e_11b74b885361.slice/crio-634ae2864e052a102eb20e4e63f0c0ba7e1e9f51d819c5c7822817055ed08b09 WatchSource:0}: Error finding container 634ae2864e052a102eb20e4e63f0c0ba7e1e9f51d819c5c7822817055ed08b09: Status 404 returned error can't find the container with id 634ae2864e052a102eb20e4e63f0c0ba7e1e9f51d819c5c7822817055ed08b09 Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.457578 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-df4gj" 
event={"ID":"5eadc17c-def7-44ac-bafe-23adea8e696a","Type":"ContainerStarted","Data":"d49ed18512af1979da607775f33ac9e3d773a2c6383fe37bde1a20c1fdb2a781"} Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.457623 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-df4gj" event={"ID":"5eadc17c-def7-44ac-bafe-23adea8e696a","Type":"ContainerStarted","Data":"4a452f2558dbba5e3fb67e84ff3e8d1377af3b847da323767ee7584d24de05fa"} Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.459639 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906fbd6e-e72f-428f-b182-f583c009fc93","Type":"ContainerStarted","Data":"37e912b00f4c9f782f11089f0f3fac81a5cbbb16025730f787d3ab4daa71bb8d"} Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.580652 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aef0df7-8283-429d-a90f-756a236e04c2" path="/var/lib/kubelet/pods/5aef0df7-8283-429d-a90f-756a236e04c2/volumes" Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.647484 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-k52t8"] Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.650905 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-f7rgm" podStartSLOduration=3.650866493 podStartE2EDuration="3.650866493s" podCreationTimestamp="2025-09-30 19:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:01.393460727 +0000 UTC m=+994.592962847" watchObservedRunningTime="2025-09-30 19:49:01.650866493 +0000 UTC m=+994.850368623" Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.731128 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-df4gj" podStartSLOduration=3.731109605 podStartE2EDuration="3.731109605s" 
podCreationTimestamp="2025-09-30 19:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:01.490980815 +0000 UTC m=+994.690482945" watchObservedRunningTime="2025-09-30 19:49:01.731109605 +0000 UTC m=+994.930611725" Sep 30 19:49:01 crc kubenswrapper[4553]: I0930 19:49:01.946100 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.094913 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-dns-swift-storage-0\") pod \"5b0ff91e-d7de-4edf-9206-197dea687f2f\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.095010 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-dns-svc\") pod \"5b0ff91e-d7de-4edf-9206-197dea687f2f\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.095061 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zss5d\" (UniqueName: \"kubernetes.io/projected/5b0ff91e-d7de-4edf-9206-197dea687f2f-kube-api-access-zss5d\") pod \"5b0ff91e-d7de-4edf-9206-197dea687f2f\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.095101 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-ovsdbserver-sb\") pod \"5b0ff91e-d7de-4edf-9206-197dea687f2f\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.095219 
4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-ovsdbserver-nb\") pod \"5b0ff91e-d7de-4edf-9206-197dea687f2f\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.095321 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-config\") pod \"5b0ff91e-d7de-4edf-9206-197dea687f2f\" (UID: \"5b0ff91e-d7de-4edf-9206-197dea687f2f\") " Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.108540 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b0ff91e-d7de-4edf-9206-197dea687f2f-kube-api-access-zss5d" (OuterVolumeSpecName: "kube-api-access-zss5d") pod "5b0ff91e-d7de-4edf-9206-197dea687f2f" (UID: "5b0ff91e-d7de-4edf-9206-197dea687f2f"). InnerVolumeSpecName "kube-api-access-zss5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.124002 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5b0ff91e-d7de-4edf-9206-197dea687f2f" (UID: "5b0ff91e-d7de-4edf-9206-197dea687f2f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.125663 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5b0ff91e-d7de-4edf-9206-197dea687f2f" (UID: "5b0ff91e-d7de-4edf-9206-197dea687f2f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.127797 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5b0ff91e-d7de-4edf-9206-197dea687f2f" (UID: "5b0ff91e-d7de-4edf-9206-197dea687f2f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.129496 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b0ff91e-d7de-4edf-9206-197dea687f2f" (UID: "5b0ff91e-d7de-4edf-9206-197dea687f2f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.137215 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-config" (OuterVolumeSpecName: "config") pod "5b0ff91e-d7de-4edf-9206-197dea687f2f" (UID: "5b0ff91e-d7de-4edf-9206-197dea687f2f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.197595 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.197622 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zss5d\" (UniqueName: \"kubernetes.io/projected/5b0ff91e-d7de-4edf-9206-197dea687f2f-kube-api-access-zss5d\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.197634 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.197642 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.197650 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.197658 4553 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b0ff91e-d7de-4edf-9206-197dea687f2f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.481465 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" event={"ID":"5b0ff91e-d7de-4edf-9206-197dea687f2f","Type":"ContainerDied","Data":"d4f6839aea2679e81e656229f131d0e6970e9787c0b5f22ecc8181091d1965d5"} Sep 30 19:49:02 crc 
kubenswrapper[4553]: I0930 19:49:02.481771 4553 scope.go:117] "RemoveContainer" containerID="8bfaa7951f07d2a2c8d8a36c51f5d4317170df4653f2a068460bed7bcc312599" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.481893 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-4lhbd" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.500476 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" event={"ID":"2c64bfc6-5406-43fe-9f4d-744b7634d300","Type":"ContainerStarted","Data":"fb91c056426a313d982c39ff456a49e663bcca9a1f40ff117a47f312445aea66"} Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.501371 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.518401 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zsslz" event={"ID":"08c9fecb-7dc9-4aed-b134-98995f1cf280","Type":"ContainerStarted","Data":"a99f92f2f006a208b306cc216ad4ef90c673dd3adf9b3adc3a9bc73bc016b7a1"} Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.533988 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd6c97f9c-7td27" event={"ID":"a0e3b11b-9596-49bc-bd3e-11b74b885361","Type":"ContainerStarted","Data":"634ae2864e052a102eb20e4e63f0c0ba7e1e9f51d819c5c7822817055ed08b09"} Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.544063 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k52t8" event={"ID":"c9958ea9-408e-4b14-8b23-dd1662654cd1","Type":"ContainerStarted","Data":"31d5a871755fb18d5b9ea89b5e5a3bd1fc66efa43b7c0b7836db9e1f04bfcafd"} Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.546500 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" podStartSLOduration=3.546485184 
podStartE2EDuration="3.546485184s" podCreationTimestamp="2025-09-30 19:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:02.540116652 +0000 UTC m=+995.739618782" watchObservedRunningTime="2025-09-30 19:49:02.546485184 +0000 UTC m=+995.745987314" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.665077 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-4lhbd"] Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.667010 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-4lhbd"] Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.747216 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f698cb877-vqsz6"] Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.804111 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.828197 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6cbc5c6fc7-8grv2"] Sep 30 19:49:02 crc kubenswrapper[4553]: E0930 19:49:02.828647 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0ff91e-d7de-4edf-9206-197dea687f2f" containerName="init" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.828663 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0ff91e-d7de-4edf-9206-197dea687f2f" containerName="init" Sep 30 19:49:02 crc kubenswrapper[4553]: E0930 19:49:02.828681 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aef0df7-8283-429d-a90f-756a236e04c2" containerName="init" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.828690 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aef0df7-8283-429d-a90f-756a236e04c2" containerName="init" Sep 30 19:49:02 crc kubenswrapper[4553]: E0930 19:49:02.828721 4553 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5aef0df7-8283-429d-a90f-756a236e04c2" containerName="dnsmasq-dns" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.828728 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aef0df7-8283-429d-a90f-756a236e04c2" containerName="dnsmasq-dns" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.828885 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b0ff91e-d7de-4edf-9206-197dea687f2f" containerName="init" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.828907 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aef0df7-8283-429d-a90f-756a236e04c2" containerName="dnsmasq-dns" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.829862 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:02 crc kubenswrapper[4553]: I0930 19:49:02.868778 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cbc5c6fc7-8grv2"] Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.019836 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/568e1d06-f892-49fc-8cdf-ee433c1cee17-config-data\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.020439 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/568e1d06-f892-49fc-8cdf-ee433c1cee17-scripts\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.020528 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vlqnz\" (UniqueName: \"kubernetes.io/projected/568e1d06-f892-49fc-8cdf-ee433c1cee17-kube-api-access-vlqnz\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.020598 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/568e1d06-f892-49fc-8cdf-ee433c1cee17-horizon-secret-key\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.020664 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/568e1d06-f892-49fc-8cdf-ee433c1cee17-logs\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.126088 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/568e1d06-f892-49fc-8cdf-ee433c1cee17-config-data\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.126180 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/568e1d06-f892-49fc-8cdf-ee433c1cee17-scripts\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.126210 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlqnz\" (UniqueName: 
\"kubernetes.io/projected/568e1d06-f892-49fc-8cdf-ee433c1cee17-kube-api-access-vlqnz\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.126231 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/568e1d06-f892-49fc-8cdf-ee433c1cee17-horizon-secret-key\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.126259 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/568e1d06-f892-49fc-8cdf-ee433c1cee17-logs\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.126699 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/568e1d06-f892-49fc-8cdf-ee433c1cee17-logs\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.127274 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/568e1d06-f892-49fc-8cdf-ee433c1cee17-scripts\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.129639 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/568e1d06-f892-49fc-8cdf-ee433c1cee17-config-data\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " 
pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.157132 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlqnz\" (UniqueName: \"kubernetes.io/projected/568e1d06-f892-49fc-8cdf-ee433c1cee17-kube-api-access-vlqnz\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.165083 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/568e1d06-f892-49fc-8cdf-ee433c1cee17-horizon-secret-key\") pod \"horizon-6cbc5c6fc7-8grv2\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.173007 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.299438 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gw5ch" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.437720 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-config-data\") pod \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.437975 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-db-sync-config-data\") pod \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.438012 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs6qb\" (UniqueName: \"kubernetes.io/projected/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-kube-api-access-vs6qb\") pod \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.438083 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-combined-ca-bundle\") pod \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\" (UID: \"3f9a8e95-e61a-473d-a74f-cf7a6820ff97\") " Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.453319 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3f9a8e95-e61a-473d-a74f-cf7a6820ff97" (UID: "3f9a8e95-e61a-473d-a74f-cf7a6820ff97"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.461509 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-kube-api-access-vs6qb" (OuterVolumeSpecName: "kube-api-access-vs6qb") pod "3f9a8e95-e61a-473d-a74f-cf7a6820ff97" (UID: "3f9a8e95-e61a-473d-a74f-cf7a6820ff97"). InnerVolumeSpecName "kube-api-access-vs6qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.469139 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f9a8e95-e61a-473d-a74f-cf7a6820ff97" (UID: "3f9a8e95-e61a-473d-a74f-cf7a6820ff97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.545132 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.545151 4553 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.545160 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs6qb\" (UniqueName: \"kubernetes.io/projected/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-kube-api-access-vs6qb\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.545247 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-config-data" (OuterVolumeSpecName: "config-data") pod "3f9a8e95-e61a-473d-a74f-cf7a6820ff97" (UID: "3f9a8e95-e61a-473d-a74f-cf7a6820ff97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.581640 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gw5ch" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.625732 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b0ff91e-d7de-4edf-9206-197dea687f2f" path="/var/lib/kubelet/pods/5b0ff91e-d7de-4edf-9206-197dea687f2f/volumes" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.626970 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gw5ch" event={"ID":"3f9a8e95-e61a-473d-a74f-cf7a6820ff97","Type":"ContainerDied","Data":"38b62e122cbca5b98fb812459e852f9d52657eb1f8d17a8e156249048fa7772f"} Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.626997 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38b62e122cbca5b98fb812459e852f9d52657eb1f8d17a8e156249048fa7772f" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.650385 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9a8e95-e61a-473d-a74f-cf7a6820ff97-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.754633 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cbc5c6fc7-8grv2"] Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.888527 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-pcbqs"] Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.944189 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rcrn7"] Sep 30 
19:49:03 crc kubenswrapper[4553]: E0930 19:49:03.944566 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9a8e95-e61a-473d-a74f-cf7a6820ff97" containerName="glance-db-sync" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.944578 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9a8e95-e61a-473d-a74f-cf7a6820ff97" containerName="glance-db-sync" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.944754 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9a8e95-e61a-473d-a74f-cf7a6820ff97" containerName="glance-db-sync" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.945606 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:03 crc kubenswrapper[4553]: I0930 19:49:03.991492 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rcrn7"] Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.059085 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.059180 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.059300 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckbbf\" (UniqueName: 
\"kubernetes.io/projected/8ab14a4f-024a-4e42-96a2-6ca958df01f5-kube-api-access-ckbbf\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.059326 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.059388 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-config\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.059453 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.162055 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.162126 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.162165 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.162224 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.162242 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckbbf\" (UniqueName: \"kubernetes.io/projected/8ab14a4f-024a-4e42-96a2-6ca958df01f5-kube-api-access-ckbbf\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.162271 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-config\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.164053 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.164892 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.164955 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.165325 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-config\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.165420 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.191993 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckbbf\" (UniqueName: \"kubernetes.io/projected/8ab14a4f-024a-4e42-96a2-6ca958df01f5-kube-api-access-ckbbf\") pod 
\"dnsmasq-dns-785d8bcb8c-rcrn7\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") " pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.280844 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.675214 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cbc5c6fc7-8grv2" event={"ID":"568e1d06-f892-49fc-8cdf-ee433c1cee17","Type":"ContainerStarted","Data":"d701cd9e1f42243b8b2774a54988d917cd31ffb0125243e6e51bf17907f52245"} Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.715031 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.716544 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.719635 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.719809 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.719939 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zv45w" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.726617 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.875180 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-scripts\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " 
pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.875319 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.875338 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.875593 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2nlr\" (UniqueName: \"kubernetes.io/projected/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-kube-api-access-d2nlr\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.875762 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-config-data\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.875868 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-logs\") pod \"glance-default-external-api-0\" (UID: 
\"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.875972 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.978427 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.978465 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.978490 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2nlr\" (UniqueName: \"kubernetes.io/projected/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-kube-api-access-d2nlr\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.978529 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-config-data\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " 
pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.978554 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-logs\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.978595 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.978625 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-scripts\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.984739 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.987004 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-scripts\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.987470 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.988802 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.988810 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-logs\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:04 crc kubenswrapper[4553]: I0930 19:49:04.990240 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-config-data\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.008932 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2nlr\" (UniqueName: \"kubernetes.io/projected/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-kube-api-access-d2nlr\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.033009 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.049168 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rcrn7"] Sep 30 19:49:05 crc kubenswrapper[4553]: W0930 19:49:05.064326 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ab14a4f_024a_4e42_96a2_6ca958df01f5.slice/crio-52d6307a7def104c10fc939cfee7d8820cf9982f43599ff93d67bc821c0d0c33 WatchSource:0}: Error finding container 52d6307a7def104c10fc939cfee7d8820cf9982f43599ff93d67bc821c0d0c33: Status 404 returned error can't find the container with id 52d6307a7def104c10fc939cfee7d8820cf9982f43599ff93d67bc821c0d0c33 Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.324224 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.325505 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.330076 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.336412 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.338253 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.490515 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b3f976-3c25-4141-b586-2f06390b1a7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.490989 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b3f976-3c25-4141-b586-2f06390b1a7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.491199 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk2nt\" (UniqueName: \"kubernetes.io/projected/c3b3f976-3c25-4141-b586-2f06390b1a7a-kube-api-access-jk2nt\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.491268 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.491338 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.491367 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.491477 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.597485 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.597587 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b3f976-3c25-4141-b586-2f06390b1a7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.597613 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b3f976-3c25-4141-b586-2f06390b1a7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.597655 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk2nt\" (UniqueName: \"kubernetes.io/projected/c3b3f976-3c25-4141-b586-2f06390b1a7a-kube-api-access-jk2nt\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.597682 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.597729 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.597758 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.598510 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b3f976-3c25-4141-b586-2f06390b1a7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc 
kubenswrapper[4553]: I0930 19:49:05.599552 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b3f976-3c25-4141-b586-2f06390b1a7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.599784 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.619282 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.619297 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.622717 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.629388 4553 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.643536 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk2nt\" (UniqueName: \"kubernetes.io/projected/c3b3f976-3c25-4141-b586-2f06390b1a7a-kube-api-access-jk2nt\") pod \"glance-default-internal-api-0\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.663344 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.745599 4553 generic.go:334] "Generic (PLEG): container finished" podID="8ab14a4f-024a-4e42-96a2-6ca958df01f5" containerID="ae2c0e9e2a23f40fe2430e153a1ed8ff54b7e8439afa5d67168fea8d1931ff83" exitCode=0 Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.745845 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" podUID="2c64bfc6-5406-43fe-9f4d-744b7634d300" containerName="dnsmasq-dns" containerID="cri-o://fb91c056426a313d982c39ff456a49e663bcca9a1f40ff117a47f312445aea66" gracePeriod=10 Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.746847 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" event={"ID":"8ab14a4f-024a-4e42-96a2-6ca958df01f5","Type":"ContainerDied","Data":"ae2c0e9e2a23f40fe2430e153a1ed8ff54b7e8439afa5d67168fea8d1931ff83"} Sep 30 19:49:05 crc kubenswrapper[4553]: I0930 19:49:05.748890 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" 
event={"ID":"8ab14a4f-024a-4e42-96a2-6ca958df01f5","Type":"ContainerStarted","Data":"52d6307a7def104c10fc939cfee7d8820cf9982f43599ff93d67bc821c0d0c33"} Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.109464 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.287366 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.434472 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-config\") pod \"2c64bfc6-5406-43fe-9f4d-744b7634d300\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.434618 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzg9v\" (UniqueName: \"kubernetes.io/projected/2c64bfc6-5406-43fe-9f4d-744b7634d300-kube-api-access-fzg9v\") pod \"2c64bfc6-5406-43fe-9f4d-744b7634d300\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.434679 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-dns-swift-storage-0\") pod \"2c64bfc6-5406-43fe-9f4d-744b7634d300\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.434702 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-ovsdbserver-sb\") pod \"2c64bfc6-5406-43fe-9f4d-744b7634d300\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.434798 4553 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-dns-svc\") pod \"2c64bfc6-5406-43fe-9f4d-744b7634d300\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.434836 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-ovsdbserver-nb\") pod \"2c64bfc6-5406-43fe-9f4d-744b7634d300\" (UID: \"2c64bfc6-5406-43fe-9f4d-744b7634d300\") " Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.446363 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c64bfc6-5406-43fe-9f4d-744b7634d300-kube-api-access-fzg9v" (OuterVolumeSpecName: "kube-api-access-fzg9v") pod "2c64bfc6-5406-43fe-9f4d-744b7634d300" (UID: "2c64bfc6-5406-43fe-9f4d-744b7634d300"). InnerVolumeSpecName "kube-api-access-fzg9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.516903 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c64bfc6-5406-43fe-9f4d-744b7634d300" (UID: "2c64bfc6-5406-43fe-9f4d-744b7634d300"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.538253 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzg9v\" (UniqueName: \"kubernetes.io/projected/2c64bfc6-5406-43fe-9f4d-744b7634d300-kube-api-access-fzg9v\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.538280 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.553678 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-config" (OuterVolumeSpecName: "config") pod "2c64bfc6-5406-43fe-9f4d-744b7634d300" (UID: "2c64bfc6-5406-43fe-9f4d-744b7634d300"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.586606 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c64bfc6-5406-43fe-9f4d-744b7634d300" (UID: "2c64bfc6-5406-43fe-9f4d-744b7634d300"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.587929 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c64bfc6-5406-43fe-9f4d-744b7634d300" (UID: "2c64bfc6-5406-43fe-9f4d-744b7634d300"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.605268 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2c64bfc6-5406-43fe-9f4d-744b7634d300" (UID: "2c64bfc6-5406-43fe-9f4d-744b7634d300"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.640733 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.641091 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.641101 4553 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.641111 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c64bfc6-5406-43fe-9f4d-744b7634d300-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.696076 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:49:06 crc kubenswrapper[4553]: W0930 19:49:06.710990 4553 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3b3f976_3c25_4141_b586_2f06390b1a7a.slice/crio-f51cb0446b3a84dc865387d5f4ec80caa9eb9232d47d43838a8cc90d1e138ac8 WatchSource:0}: Error finding container f51cb0446b3a84dc865387d5f4ec80caa9eb9232d47d43838a8cc90d1e138ac8: Status 404 returned error can't find the container with id f51cb0446b3a84dc865387d5f4ec80caa9eb9232d47d43838a8cc90d1e138ac8 Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.783001 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" event={"ID":"8ab14a4f-024a-4e42-96a2-6ca958df01f5","Type":"ContainerStarted","Data":"d8169d85f4db9e47515e590e5c07105ad5c3777fd24627357301b6121802eeb8"} Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.783121 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.802583 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c3b3f976-3c25-4141-b586-2f06390b1a7a","Type":"ContainerStarted","Data":"f51cb0446b3a84dc865387d5f4ec80caa9eb9232d47d43838a8cc90d1e138ac8"} Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.805912 4553 generic.go:334] "Generic (PLEG): container finished" podID="2c64bfc6-5406-43fe-9f4d-744b7634d300" containerID="fb91c056426a313d982c39ff456a49e663bcca9a1f40ff117a47f312445aea66" exitCode=0 Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.805957 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" event={"ID":"2c64bfc6-5406-43fe-9f4d-744b7634d300","Type":"ContainerDied","Data":"fb91c056426a313d982c39ff456a49e663bcca9a1f40ff117a47f312445aea66"} Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.805975 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" 
event={"ID":"2c64bfc6-5406-43fe-9f4d-744b7634d300","Type":"ContainerDied","Data":"7018b39a8138f0c5c38adf1096db41befb700c8ff7e779c818ca2082687082dd"} Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.805990 4553 scope.go:117] "RemoveContainer" containerID="fb91c056426a313d982c39ff456a49e663bcca9a1f40ff117a47f312445aea66" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.806167 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-pcbqs" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.815222 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c34ed0d-a9d2-4b7a-a006-8486a35d5502","Type":"ContainerStarted","Data":"0bc8d426b897921641df1c93ca9a520d21eec2907eb8815a257ac1cfd9c2249d"} Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.830460 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" podStartSLOduration=3.8304383570000002 podStartE2EDuration="3.830438357s" podCreationTimestamp="2025-09-30 19:49:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:06.821565698 +0000 UTC m=+1000.021067828" watchObservedRunningTime="2025-09-30 19:49:06.830438357 +0000 UTC m=+1000.029940487" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.864865 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-pcbqs"] Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.877840 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-pcbqs"] Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.901490 4553 scope.go:117] "RemoveContainer" containerID="9d2960034063dcb84597c82212d614cf4291681107345ef366eb051a3bb6e093" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.978048 4553 scope.go:117] 
"RemoveContainer" containerID="fb91c056426a313d982c39ff456a49e663bcca9a1f40ff117a47f312445aea66" Sep 30 19:49:06 crc kubenswrapper[4553]: E0930 19:49:06.978905 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb91c056426a313d982c39ff456a49e663bcca9a1f40ff117a47f312445aea66\": container with ID starting with fb91c056426a313d982c39ff456a49e663bcca9a1f40ff117a47f312445aea66 not found: ID does not exist" containerID="fb91c056426a313d982c39ff456a49e663bcca9a1f40ff117a47f312445aea66" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.978955 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb91c056426a313d982c39ff456a49e663bcca9a1f40ff117a47f312445aea66"} err="failed to get container status \"fb91c056426a313d982c39ff456a49e663bcca9a1f40ff117a47f312445aea66\": rpc error: code = NotFound desc = could not find container \"fb91c056426a313d982c39ff456a49e663bcca9a1f40ff117a47f312445aea66\": container with ID starting with fb91c056426a313d982c39ff456a49e663bcca9a1f40ff117a47f312445aea66 not found: ID does not exist" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.978985 4553 scope.go:117] "RemoveContainer" containerID="9d2960034063dcb84597c82212d614cf4291681107345ef366eb051a3bb6e093" Sep 30 19:49:06 crc kubenswrapper[4553]: E0930 19:49:06.980578 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d2960034063dcb84597c82212d614cf4291681107345ef366eb051a3bb6e093\": container with ID starting with 9d2960034063dcb84597c82212d614cf4291681107345ef366eb051a3bb6e093 not found: ID does not exist" containerID="9d2960034063dcb84597c82212d614cf4291681107345ef366eb051a3bb6e093" Sep 30 19:49:06 crc kubenswrapper[4553]: I0930 19:49:06.980610 4553 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9d2960034063dcb84597c82212d614cf4291681107345ef366eb051a3bb6e093"} err="failed to get container status \"9d2960034063dcb84597c82212d614cf4291681107345ef366eb051a3bb6e093\": rpc error: code = NotFound desc = could not find container \"9d2960034063dcb84597c82212d614cf4291681107345ef366eb051a3bb6e093\": container with ID starting with 9d2960034063dcb84597c82212d614cf4291681107345ef366eb051a3bb6e093 not found: ID does not exist" Sep 30 19:49:07 crc kubenswrapper[4553]: I0930 19:49:07.598726 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c64bfc6-5406-43fe-9f4d-744b7634d300" path="/var/lib/kubelet/pods/2c64bfc6-5406-43fe-9f4d-744b7634d300/volumes" Sep 30 19:49:07 crc kubenswrapper[4553]: I0930 19:49:07.926527 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c34ed0d-a9d2-4b7a-a006-8486a35d5502","Type":"ContainerStarted","Data":"5c1c15be62b603d014e90280125c05ea76a8d65a111433410684c3b49722d842"} Sep 30 19:49:08 crc kubenswrapper[4553]: I0930 19:49:08.936140 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c3b3f976-3c25-4141-b586-2f06390b1a7a","Type":"ContainerStarted","Data":"1f48e20b8965f6f40e7fa346cf6d36459b18112ea605dad8898760d2f9807886"} Sep 30 19:49:08 crc kubenswrapper[4553]: I0930 19:49:08.938722 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c34ed0d-a9d2-4b7a-a006-8486a35d5502","Type":"ContainerStarted","Data":"f14d66411a0bdc931a360b08dfb245604acb636183f17cdd0123b9bb7734cff6"} Sep 30 19:49:08 crc kubenswrapper[4553]: I0930 19:49:08.940822 4553 generic.go:334] "Generic (PLEG): container finished" podID="5eadc17c-def7-44ac-bafe-23adea8e696a" containerID="d49ed18512af1979da607775f33ac9e3d773a2c6383fe37bde1a20c1fdb2a781" exitCode=0 Sep 30 19:49:08 crc kubenswrapper[4553]: I0930 19:49:08.940852 4553 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-df4gj" event={"ID":"5eadc17c-def7-44ac-bafe-23adea8e696a","Type":"ContainerDied","Data":"d49ed18512af1979da607775f33ac9e3d773a2c6383fe37bde1a20c1fdb2a781"} Sep 30 19:49:08 crc kubenswrapper[4553]: I0930 19:49:08.962939 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.962922203 podStartE2EDuration="5.962922203s" podCreationTimestamp="2025-09-30 19:49:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:08.95391362 +0000 UTC m=+1002.153415750" watchObservedRunningTime="2025-09-30 19:49:08.962922203 +0000 UTC m=+1002.162424333" Sep 30 19:49:09 crc kubenswrapper[4553]: I0930 19:49:09.954521 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c3b3f976-3c25-4141-b586-2f06390b1a7a","Type":"ContainerStarted","Data":"c77b0d6e720966437e74019f162076e7fcc1d5b7ea79aa40372565c330f7407f"} Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.017057 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.017030027 podStartE2EDuration="7.017030027s" podCreationTimestamp="2025-09-30 19:49:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:09.983815479 +0000 UTC m=+1003.183317619" watchObservedRunningTime="2025-09-30 19:49:11.017030027 +0000 UTC m=+1004.216532157" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.022410 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bd6c97f9c-7td27"] Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.076058 4553 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-84c849768b-8k9mh"] Sep 30 19:49:11 crc kubenswrapper[4553]: E0930 19:49:11.076418 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c64bfc6-5406-43fe-9f4d-744b7634d300" containerName="dnsmasq-dns" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.076430 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c64bfc6-5406-43fe-9f4d-744b7634d300" containerName="dnsmasq-dns" Sep 30 19:49:11 crc kubenswrapper[4553]: E0930 19:49:11.076448 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c64bfc6-5406-43fe-9f4d-744b7634d300" containerName="init" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.076454 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c64bfc6-5406-43fe-9f4d-744b7634d300" containerName="init" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.076636 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c64bfc6-5406-43fe-9f4d-744b7634d300" containerName="dnsmasq-dns" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.077566 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.081239 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.106862 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.128718 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84c849768b-8k9mh"] Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.154803 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6cbc5c6fc7-8grv2"] Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.198332 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.198580 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6c34ed0d-a9d2-4b7a-a006-8486a35d5502" containerName="glance-log" containerID="cri-o://5c1c15be62b603d014e90280125c05ea76a8d65a111433410684c3b49722d842" gracePeriod=30 Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.198957 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6c34ed0d-a9d2-4b7a-a006-8486a35d5502" containerName="glance-httpd" containerID="cri-o://f14d66411a0bdc931a360b08dfb245604acb636183f17cdd0123b9bb7734cff6" gracePeriod=30 Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.208811 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-868c6b469d-rhw7t"] Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.212421 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.213258 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17921f25-bee1-4e2e-a9e2-50669133664e-scripts\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.213332 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-combined-ca-bundle\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.213463 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rshk\" (UniqueName: \"kubernetes.io/projected/17921f25-bee1-4e2e-a9e2-50669133664e-kube-api-access-9rshk\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.213498 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17921f25-bee1-4e2e-a9e2-50669133664e-logs\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.213570 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-horizon-secret-key\") pod \"horizon-84c849768b-8k9mh\" (UID: 
\"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.216423 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-horizon-tls-certs\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.216555 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17921f25-bee1-4e2e-a9e2-50669133664e-config-data\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.242654 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-868c6b469d-rhw7t"] Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319163 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17921f25-bee1-4e2e-a9e2-50669133664e-scripts\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319212 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-combined-ca-bundle\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319251 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/849f4ec8-2741-4c83-82d8-135a24b43447-horizon-tls-certs\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319290 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849f4ec8-2741-4c83-82d8-135a24b43447-logs\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319307 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvwm6\" (UniqueName: \"kubernetes.io/projected/849f4ec8-2741-4c83-82d8-135a24b43447-kube-api-access-fvwm6\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319322 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rshk\" (UniqueName: \"kubernetes.io/projected/17921f25-bee1-4e2e-a9e2-50669133664e-kube-api-access-9rshk\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319359 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17921f25-bee1-4e2e-a9e2-50669133664e-logs\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319413 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-horizon-secret-key\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319446 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-horizon-tls-certs\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319479 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17921f25-bee1-4e2e-a9e2-50669133664e-config-data\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319502 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/849f4ec8-2741-4c83-82d8-135a24b43447-scripts\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319524 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849f4ec8-2741-4c83-82d8-135a24b43447-combined-ca-bundle\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319555 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/849f4ec8-2741-4c83-82d8-135a24b43447-horizon-secret-key\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.319570 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849f4ec8-2741-4c83-82d8-135a24b43447-config-data\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.320191 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17921f25-bee1-4e2e-a9e2-50669133664e-scripts\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.320337 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17921f25-bee1-4e2e-a9e2-50669133664e-logs\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.321376 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17921f25-bee1-4e2e-a9e2-50669133664e-config-data\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.326358 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-horizon-secret-key\") pod \"horizon-84c849768b-8k9mh\" (UID: 
\"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.328293 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-combined-ca-bundle\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.330358 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-horizon-tls-certs\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.352659 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rshk\" (UniqueName: \"kubernetes.io/projected/17921f25-bee1-4e2e-a9e2-50669133664e-kube-api-access-9rshk\") pod \"horizon-84c849768b-8k9mh\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.424977 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/849f4ec8-2741-4c83-82d8-135a24b43447-horizon-tls-certs\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.425048 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849f4ec8-2741-4c83-82d8-135a24b43447-logs\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc 
kubenswrapper[4553]: I0930 19:49:11.425068 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvwm6\" (UniqueName: \"kubernetes.io/projected/849f4ec8-2741-4c83-82d8-135a24b43447-kube-api-access-fvwm6\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.425150 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/849f4ec8-2741-4c83-82d8-135a24b43447-scripts\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.425172 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849f4ec8-2741-4c83-82d8-135a24b43447-combined-ca-bundle\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.425204 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/849f4ec8-2741-4c83-82d8-135a24b43447-horizon-secret-key\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.425220 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849f4ec8-2741-4c83-82d8-135a24b43447-config-data\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.426498 4553 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.426621 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/849f4ec8-2741-4c83-82d8-135a24b43447-logs\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.426797 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849f4ec8-2741-4c83-82d8-135a24b43447-config-data\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.426999 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/849f4ec8-2741-4c83-82d8-135a24b43447-scripts\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.429607 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/849f4ec8-2741-4c83-82d8-135a24b43447-horizon-secret-key\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.430391 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/849f4ec8-2741-4c83-82d8-135a24b43447-combined-ca-bundle\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.430604 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/849f4ec8-2741-4c83-82d8-135a24b43447-horizon-tls-certs\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.442801 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvwm6\" (UniqueName: \"kubernetes.io/projected/849f4ec8-2741-4c83-82d8-135a24b43447-kube-api-access-fvwm6\") pod \"horizon-868c6b469d-rhw7t\" (UID: \"849f4ec8-2741-4c83-82d8-135a24b43447\") " pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.553591 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.990107 4553 generic.go:334] "Generic (PLEG): container finished" podID="6c34ed0d-a9d2-4b7a-a006-8486a35d5502" containerID="f14d66411a0bdc931a360b08dfb245604acb636183f17cdd0123b9bb7734cff6" exitCode=0 Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.990332 4553 generic.go:334] "Generic (PLEG): container finished" podID="6c34ed0d-a9d2-4b7a-a006-8486a35d5502" containerID="5c1c15be62b603d014e90280125c05ea76a8d65a111433410684c3b49722d842" exitCode=143 Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.990496 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c3b3f976-3c25-4141-b586-2f06390b1a7a" containerName="glance-log" containerID="cri-o://1f48e20b8965f6f40e7fa346cf6d36459b18112ea605dad8898760d2f9807886" gracePeriod=30 Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.990729 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"6c34ed0d-a9d2-4b7a-a006-8486a35d5502","Type":"ContainerDied","Data":"f14d66411a0bdc931a360b08dfb245604acb636183f17cdd0123b9bb7734cff6"} Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.990755 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c34ed0d-a9d2-4b7a-a006-8486a35d5502","Type":"ContainerDied","Data":"5c1c15be62b603d014e90280125c05ea76a8d65a111433410684c3b49722d842"} Sep 30 19:49:11 crc kubenswrapper[4553]: I0930 19:49:11.990980 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c3b3f976-3c25-4141-b586-2f06390b1a7a" containerName="glance-httpd" containerID="cri-o://c77b0d6e720966437e74019f162076e7fcc1d5b7ea79aa40372565c330f7407f" gracePeriod=30 Sep 30 19:49:13 crc kubenswrapper[4553]: I0930 19:49:13.011713 4553 generic.go:334] "Generic (PLEG): container finished" podID="c3b3f976-3c25-4141-b586-2f06390b1a7a" containerID="c77b0d6e720966437e74019f162076e7fcc1d5b7ea79aa40372565c330f7407f" exitCode=0 Sep 30 19:49:13 crc kubenswrapper[4553]: I0930 19:49:13.012620 4553 generic.go:334] "Generic (PLEG): container finished" podID="c3b3f976-3c25-4141-b586-2f06390b1a7a" containerID="1f48e20b8965f6f40e7fa346cf6d36459b18112ea605dad8898760d2f9807886" exitCode=143 Sep 30 19:49:13 crc kubenswrapper[4553]: I0930 19:49:13.011845 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c3b3f976-3c25-4141-b586-2f06390b1a7a","Type":"ContainerDied","Data":"c77b0d6e720966437e74019f162076e7fcc1d5b7ea79aa40372565c330f7407f"} Sep 30 19:49:13 crc kubenswrapper[4553]: I0930 19:49:13.012670 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c3b3f976-3c25-4141-b586-2f06390b1a7a","Type":"ContainerDied","Data":"1f48e20b8965f6f40e7fa346cf6d36459b18112ea605dad8898760d2f9807886"} Sep 30 19:49:14 crc 
kubenswrapper[4553]: I0930 19:49:14.282181 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" Sep 30 19:49:14 crc kubenswrapper[4553]: I0930 19:49:14.348967 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-6mpwk"] Sep 30 19:49:14 crc kubenswrapper[4553]: I0930 19:49:14.349185 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-6mpwk" podUID="7cde397a-0d8e-416d-8ac5-6051a5db9878" containerName="dnsmasq-dns" containerID="cri-o://42d2955b10ec7574c2f79a48f67fd3defe5c6fca5fab753f3f084a8aab1730c0" gracePeriod=10 Sep 30 19:49:14 crc kubenswrapper[4553]: I0930 19:49:14.450640 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-6mpwk" podUID="7cde397a-0d8e-416d-8ac5-6051a5db9878" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Sep 30 19:49:15 crc kubenswrapper[4553]: I0930 19:49:15.042763 4553 generic.go:334] "Generic (PLEG): container finished" podID="7cde397a-0d8e-416d-8ac5-6051a5db9878" containerID="42d2955b10ec7574c2f79a48f67fd3defe5c6fca5fab753f3f084a8aab1730c0" exitCode=0 Sep 30 19:49:15 crc kubenswrapper[4553]: I0930 19:49:15.042800 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-6mpwk" event={"ID":"7cde397a-0d8e-416d-8ac5-6051a5db9878","Type":"ContainerDied","Data":"42d2955b10ec7574c2f79a48f67fd3defe5c6fca5fab753f3f084a8aab1730c0"} Sep 30 19:49:19 crc kubenswrapper[4553]: I0930 19:49:19.445506 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-6mpwk" podUID="7cde397a-0d8e-416d-8ac5-6051a5db9878" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Sep 30 19:49:20 crc kubenswrapper[4553]: E0930 19:49:20.873631 4553 log.go:32] "PullImage from 
image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 30 19:49:20 crc kubenswrapper[4553]: E0930 19:49:20.874017 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n688h6h649h64fh57ch55dhb5h597hc5h58h5ch64ch5b6h697h68h66fh596hddh98h58fhd7h5cdh689h588h597h57ch7dh645h68ch7ch68chf5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gw9ls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,
StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-bd6c97f9c-7td27_openstack(a0e3b11b-9596-49bc-bd3e-11b74b885361): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:49:20 crc kubenswrapper[4553]: E0930 19:49:20.903617 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-bd6c97f9c-7td27" podUID="a0e3b11b-9596-49bc-bd3e-11b74b885361" Sep 30 19:49:20 crc kubenswrapper[4553]: E0930 19:49:20.935202 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 30 19:49:20 crc kubenswrapper[4553]: E0930 19:49:20.935392 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58fh96h64dh565h699h685h5c7h98h8ch564h65chbh5ch546h554h647hbh5c7h75h586h66dh664h686h556h65dh597h5ffh695h577h669h66h577q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vlqnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6cbc5c6fc7-8grv2_openstack(568e1d06-f892-49fc-8cdf-ee433c1cee17): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:49:20 crc kubenswrapper[4553]: E0930 
19:49:20.936321 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Sep 30 19:49:20 crc kubenswrapper[4553]: E0930 19:49:20.936424 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7dh57h97h59h545h5c9h649hb5hc5h95h64bh68dh647h5f6h54dh5d6hdch7fh585hf4h57ch5f7h668h68dh58ch5d8h588h68bh54h544h5dbhc5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gkq4h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,Secc
ompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-f698cb877-vqsz6_openstack(95fc6b11-e9bf-4886-9036-8276f127b8bf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:49:20 crc kubenswrapper[4553]: E0930 19:49:20.937876 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6cbc5c6fc7-8grv2" podUID="568e1d06-f892-49fc-8cdf-ee433c1cee17" Sep 30 19:49:20 crc kubenswrapper[4553]: E0930 19:49:20.938994 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-f698cb877-vqsz6" podUID="95fc6b11-e9bf-4886-9036-8276f127b8bf" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.012655 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.097413 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-df4gj" event={"ID":"5eadc17c-def7-44ac-bafe-23adea8e696a","Type":"ContainerDied","Data":"4a452f2558dbba5e3fb67e84ff3e8d1377af3b847da323767ee7584d24de05fa"} Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.097457 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a452f2558dbba5e3fb67e84ff3e8d1377af3b847da323767ee7584d24de05fa" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.097522 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-df4gj" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.103090 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwst4\" (UniqueName: \"kubernetes.io/projected/5eadc17c-def7-44ac-bafe-23adea8e696a-kube-api-access-qwst4\") pod \"5eadc17c-def7-44ac-bafe-23adea8e696a\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.103133 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-config-data\") pod \"5eadc17c-def7-44ac-bafe-23adea8e696a\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.103245 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-combined-ca-bundle\") pod \"5eadc17c-def7-44ac-bafe-23adea8e696a\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.103291 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-fernet-keys\") pod \"5eadc17c-def7-44ac-bafe-23adea8e696a\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.103398 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-scripts\") pod \"5eadc17c-def7-44ac-bafe-23adea8e696a\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.108783 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-scripts" (OuterVolumeSpecName: "scripts") pod "5eadc17c-def7-44ac-bafe-23adea8e696a" (UID: "5eadc17c-def7-44ac-bafe-23adea8e696a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.111229 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eadc17c-def7-44ac-bafe-23adea8e696a-kube-api-access-qwst4" (OuterVolumeSpecName: "kube-api-access-qwst4") pod "5eadc17c-def7-44ac-bafe-23adea8e696a" (UID: "5eadc17c-def7-44ac-bafe-23adea8e696a"). InnerVolumeSpecName "kube-api-access-qwst4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.112760 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-credential-keys\") pod \"5eadc17c-def7-44ac-bafe-23adea8e696a\" (UID: \"5eadc17c-def7-44ac-bafe-23adea8e696a\") " Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.117659 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5eadc17c-def7-44ac-bafe-23adea8e696a" (UID: "5eadc17c-def7-44ac-bafe-23adea8e696a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.120745 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5eadc17c-def7-44ac-bafe-23adea8e696a" (UID: "5eadc17c-def7-44ac-bafe-23adea8e696a"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.132215 4553 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.132482 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwst4\" (UniqueName: \"kubernetes.io/projected/5eadc17c-def7-44ac-bafe-23adea8e696a-kube-api-access-qwst4\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.132560 4553 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.132718 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.154985 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-config-data" (OuterVolumeSpecName: "config-data") pod "5eadc17c-def7-44ac-bafe-23adea8e696a" (UID: "5eadc17c-def7-44ac-bafe-23adea8e696a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.207305 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5eadc17c-def7-44ac-bafe-23adea8e696a" (UID: "5eadc17c-def7-44ac-bafe-23adea8e696a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.238265 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:21 crc kubenswrapper[4553]: I0930 19:49:21.238297 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eadc17c-def7-44ac-bafe-23adea8e696a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.108205 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-df4gj"] Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.117271 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-df4gj"] Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.199769 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pk229"] Sep 30 19:49:22 crc kubenswrapper[4553]: E0930 19:49:22.200187 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eadc17c-def7-44ac-bafe-23adea8e696a" containerName="keystone-bootstrap" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.200205 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eadc17c-def7-44ac-bafe-23adea8e696a" containerName="keystone-bootstrap" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.200370 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eadc17c-def7-44ac-bafe-23adea8e696a" containerName="keystone-bootstrap" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.200974 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.203669 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.204255 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.204323 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hswpl" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.204351 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.207463 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pk229"] Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.360582 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-scripts\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.360675 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq7zj\" (UniqueName: \"kubernetes.io/projected/2633e01b-c518-4077-af93-7ba213150186-kube-api-access-cq7zj\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.360767 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-fernet-keys\") pod \"keystone-bootstrap-pk229\" (UID: 
\"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.360809 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-credential-keys\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.360840 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-combined-ca-bundle\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.360887 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-config-data\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.463176 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-fernet-keys\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.463246 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-credential-keys\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " 
pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.463292 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-combined-ca-bundle\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.463343 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-config-data\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.463412 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-scripts\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.464312 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq7zj\" (UniqueName: \"kubernetes.io/projected/2633e01b-c518-4077-af93-7ba213150186-kube-api-access-cq7zj\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.468983 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-combined-ca-bundle\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.469303 
4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-config-data\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.469609 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-scripts\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.471094 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-fernet-keys\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.480548 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq7zj\" (UniqueName: \"kubernetes.io/projected/2633e01b-c518-4077-af93-7ba213150186-kube-api-access-cq7zj\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.486319 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-credential-keys\") pod \"keystone-bootstrap-pk229\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") " pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:22 crc kubenswrapper[4553]: I0930 19:49:22.522058 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pk229" Sep 30 19:49:23 crc kubenswrapper[4553]: I0930 19:49:23.515836 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eadc17c-def7-44ac-bafe-23adea8e696a" path="/var/lib/kubelet/pods/5eadc17c-def7-44ac-bafe-23adea8e696a/volumes" Sep 30 19:49:24 crc kubenswrapper[4553]: I0930 19:49:24.446302 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-6mpwk" podUID="7cde397a-0d8e-416d-8ac5-6051a5db9878" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Sep 30 19:49:24 crc kubenswrapper[4553]: I0930 19:49:24.446495 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:49:24 crc kubenswrapper[4553]: E0930 19:49:24.669419 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Sep 30 19:49:24 crc kubenswrapper[4553]: E0930 19:49:24.669597 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lt5fx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-zsslz_openstack(08c9fecb-7dc9-4aed-b134-98995f1cf280): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:49:24 crc kubenswrapper[4553]: E0930 19:49:24.671452 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-zsslz" podUID="08c9fecb-7dc9-4aed-b134-98995f1cf280" Sep 30 19:49:25 crc kubenswrapper[4553]: E0930 19:49:25.132905 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-zsslz" podUID="08c9fecb-7dc9-4aed-b134-98995f1cf280" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.784126 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.794764 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.798507 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.809088 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.873850 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0e3b11b-9596-49bc-bd3e-11b74b885361-scripts\") pod \"a0e3b11b-9596-49bc-bd3e-11b74b885361\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.874057 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0e3b11b-9596-49bc-bd3e-11b74b885361-config-data\") pod \"a0e3b11b-9596-49bc-bd3e-11b74b885361\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.874078 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0e3b11b-9596-49bc-bd3e-11b74b885361-logs\") pod \"a0e3b11b-9596-49bc-bd3e-11b74b885361\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.874107 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a0e3b11b-9596-49bc-bd3e-11b74b885361-horizon-secret-key\") pod \"a0e3b11b-9596-49bc-bd3e-11b74b885361\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.874168 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw9ls\" (UniqueName: \"kubernetes.io/projected/a0e3b11b-9596-49bc-bd3e-11b74b885361-kube-api-access-gw9ls\") pod \"a0e3b11b-9596-49bc-bd3e-11b74b885361\" (UID: \"a0e3b11b-9596-49bc-bd3e-11b74b885361\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.875671 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a0e3b11b-9596-49bc-bd3e-11b74b885361-config-data" (OuterVolumeSpecName: "config-data") pod "a0e3b11b-9596-49bc-bd3e-11b74b885361" (UID: "a0e3b11b-9596-49bc-bd3e-11b74b885361"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.876301 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e3b11b-9596-49bc-bd3e-11b74b885361-scripts" (OuterVolumeSpecName: "scripts") pod "a0e3b11b-9596-49bc-bd3e-11b74b885361" (UID: "a0e3b11b-9596-49bc-bd3e-11b74b885361"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.876634 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0e3b11b-9596-49bc-bd3e-11b74b885361-logs" (OuterVolumeSpecName: "logs") pod "a0e3b11b-9596-49bc-bd3e-11b74b885361" (UID: "a0e3b11b-9596-49bc-bd3e-11b74b885361"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.879973 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e3b11b-9596-49bc-bd3e-11b74b885361-kube-api-access-gw9ls" (OuterVolumeSpecName: "kube-api-access-gw9ls") pod "a0e3b11b-9596-49bc-bd3e-11b74b885361" (UID: "a0e3b11b-9596-49bc-bd3e-11b74b885361"). InnerVolumeSpecName "kube-api-access-gw9ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.889549 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e3b11b-9596-49bc-bd3e-11b74b885361-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a0e3b11b-9596-49bc-bd3e-11b74b885361" (UID: "a0e3b11b-9596-49bc-bd3e-11b74b885361"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.975152 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlqnz\" (UniqueName: \"kubernetes.io/projected/568e1d06-f892-49fc-8cdf-ee433c1cee17-kube-api-access-vlqnz\") pod \"568e1d06-f892-49fc-8cdf-ee433c1cee17\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.975228 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/95fc6b11-e9bf-4886-9036-8276f127b8bf-horizon-secret-key\") pod \"95fc6b11-e9bf-4886-9036-8276f127b8bf\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.975263 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2nlr\" (UniqueName: \"kubernetes.io/projected/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-kube-api-access-d2nlr\") pod \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.975303 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-httpd-run\") pod \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.975355 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/568e1d06-f892-49fc-8cdf-ee433c1cee17-logs\") pod \"568e1d06-f892-49fc-8cdf-ee433c1cee17\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.975751 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/568e1d06-f892-49fc-8cdf-ee433c1cee17-logs" (OuterVolumeSpecName: "logs") pod "568e1d06-f892-49fc-8cdf-ee433c1cee17" (UID: "568e1d06-f892-49fc-8cdf-ee433c1cee17"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.975820 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6c34ed0d-a9d2-4b7a-a006-8486a35d5502" (UID: "6c34ed0d-a9d2-4b7a-a006-8486a35d5502"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.975377 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-combined-ca-bundle\") pod \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.975889 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-logs\") pod \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.975916 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95fc6b11-e9bf-4886-9036-8276f127b8bf-logs\") pod \"95fc6b11-e9bf-4886-9036-8276f127b8bf\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.975940 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/568e1d06-f892-49fc-8cdf-ee433c1cee17-config-data\") pod 
\"568e1d06-f892-49fc-8cdf-ee433c1cee17\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.976314 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95fc6b11-e9bf-4886-9036-8276f127b8bf-logs" (OuterVolumeSpecName: "logs") pod "95fc6b11-e9bf-4886-9036-8276f127b8bf" (UID: "95fc6b11-e9bf-4886-9036-8276f127b8bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.976348 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkq4h\" (UniqueName: \"kubernetes.io/projected/95fc6b11-e9bf-4886-9036-8276f127b8bf-kube-api-access-gkq4h\") pod \"95fc6b11-e9bf-4886-9036-8276f127b8bf\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.976442 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-logs" (OuterVolumeSpecName: "logs") pod "6c34ed0d-a9d2-4b7a-a006-8486a35d5502" (UID: "6c34ed0d-a9d2-4b7a-a006-8486a35d5502"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.976553 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/568e1d06-f892-49fc-8cdf-ee433c1cee17-config-data" (OuterVolumeSpecName: "config-data") pod "568e1d06-f892-49fc-8cdf-ee433c1cee17" (UID: "568e1d06-f892-49fc-8cdf-ee433c1cee17"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.976633 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/568e1d06-f892-49fc-8cdf-ee433c1cee17-scripts\") pod \"568e1d06-f892-49fc-8cdf-ee433c1cee17\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.976659 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.976677 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95fc6b11-e9bf-4886-9036-8276f127b8bf-config-data\") pod \"95fc6b11-e9bf-4886-9036-8276f127b8bf\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.976748 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/568e1d06-f892-49fc-8cdf-ee433c1cee17-horizon-secret-key\") pod \"568e1d06-f892-49fc-8cdf-ee433c1cee17\" (UID: \"568e1d06-f892-49fc-8cdf-ee433c1cee17\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.976766 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-scripts\") pod \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.976785 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-config-data\") pod \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\" (UID: \"6c34ed0d-a9d2-4b7a-a006-8486a35d5502\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.976801 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95fc6b11-e9bf-4886-9036-8276f127b8bf-scripts\") pod \"95fc6b11-e9bf-4886-9036-8276f127b8bf\" (UID: \"95fc6b11-e9bf-4886-9036-8276f127b8bf\") " Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.976947 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/568e1d06-f892-49fc-8cdf-ee433c1cee17-scripts" (OuterVolumeSpecName: "scripts") pod "568e1d06-f892-49fc-8cdf-ee433c1cee17" (UID: "568e1d06-f892-49fc-8cdf-ee433c1cee17"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.977222 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/568e1d06-f892-49fc-8cdf-ee433c1cee17-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.977234 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0e3b11b-9596-49bc-bd3e-11b74b885361-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.977242 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.977250 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95fc6b11-e9bf-4886-9036-8276f127b8bf-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.977258 4553 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/568e1d06-f892-49fc-8cdf-ee433c1cee17-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.977267 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/568e1d06-f892-49fc-8cdf-ee433c1cee17-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.977274 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0e3b11b-9596-49bc-bd3e-11b74b885361-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.977281 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0e3b11b-9596-49bc-bd3e-11b74b885361-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.977289 4553 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a0e3b11b-9596-49bc-bd3e-11b74b885361-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.977299 4553 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.977307 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw9ls\" (UniqueName: \"kubernetes.io/projected/a0e3b11b-9596-49bc-bd3e-11b74b885361-kube-api-access-gw9ls\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.977615 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95fc6b11-e9bf-4886-9036-8276f127b8bf-scripts" 
(OuterVolumeSpecName: "scripts") pod "95fc6b11-e9bf-4886-9036-8276f127b8bf" (UID: "95fc6b11-e9bf-4886-9036-8276f127b8bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.978229 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95fc6b11-e9bf-4886-9036-8276f127b8bf-config-data" (OuterVolumeSpecName: "config-data") pod "95fc6b11-e9bf-4886-9036-8276f127b8bf" (UID: "95fc6b11-e9bf-4886-9036-8276f127b8bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.982440 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-scripts" (OuterVolumeSpecName: "scripts") pod "6c34ed0d-a9d2-4b7a-a006-8486a35d5502" (UID: "6c34ed0d-a9d2-4b7a-a006-8486a35d5502"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.983792 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95fc6b11-e9bf-4886-9036-8276f127b8bf-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "95fc6b11-e9bf-4886-9036-8276f127b8bf" (UID: "95fc6b11-e9bf-4886-9036-8276f127b8bf"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.984195 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/568e1d06-f892-49fc-8cdf-ee433c1cee17-kube-api-access-vlqnz" (OuterVolumeSpecName: "kube-api-access-vlqnz") pod "568e1d06-f892-49fc-8cdf-ee433c1cee17" (UID: "568e1d06-f892-49fc-8cdf-ee433c1cee17"). InnerVolumeSpecName "kube-api-access-vlqnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.985347 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-kube-api-access-d2nlr" (OuterVolumeSpecName: "kube-api-access-d2nlr") pod "6c34ed0d-a9d2-4b7a-a006-8486a35d5502" (UID: "6c34ed0d-a9d2-4b7a-a006-8486a35d5502"). InnerVolumeSpecName "kube-api-access-d2nlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.985542 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/568e1d06-f892-49fc-8cdf-ee433c1cee17-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "568e1d06-f892-49fc-8cdf-ee433c1cee17" (UID: "568e1d06-f892-49fc-8cdf-ee433c1cee17"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.992119 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95fc6b11-e9bf-4886-9036-8276f127b8bf-kube-api-access-gkq4h" (OuterVolumeSpecName: "kube-api-access-gkq4h") pod "95fc6b11-e9bf-4886-9036-8276f127b8bf" (UID: "95fc6b11-e9bf-4886-9036-8276f127b8bf"). InnerVolumeSpecName "kube-api-access-gkq4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:49:32 crc kubenswrapper[4553]: I0930 19:49:32.993915 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "6c34ed0d-a9d2-4b7a-a006-8486a35d5502" (UID: "6c34ed0d-a9d2-4b7a-a006-8486a35d5502"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.020155 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c34ed0d-a9d2-4b7a-a006-8486a35d5502" (UID: "6c34ed0d-a9d2-4b7a-a006-8486a35d5502"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.024211 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-config-data" (OuterVolumeSpecName: "config-data") pod "6c34ed0d-a9d2-4b7a-a006-8486a35d5502" (UID: "6c34ed0d-a9d2-4b7a-a006-8486a35d5502"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.078738 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkq4h\" (UniqueName: \"kubernetes.io/projected/95fc6b11-e9bf-4886-9036-8276f127b8bf-kube-api-access-gkq4h\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.078815 4553 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.078832 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95fc6b11-e9bf-4886-9036-8276f127b8bf-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.078847 4553 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/568e1d06-f892-49fc-8cdf-ee433c1cee17-horizon-secret-key\") on node \"crc\" DevicePath 
\"\"" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.078859 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.078870 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.078881 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95fc6b11-e9bf-4886-9036-8276f127b8bf-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.078892 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlqnz\" (UniqueName: \"kubernetes.io/projected/568e1d06-f892-49fc-8cdf-ee433c1cee17-kube-api-access-vlqnz\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.078916 4553 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/95fc6b11-e9bf-4886-9036-8276f127b8bf-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.078929 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2nlr\" (UniqueName: \"kubernetes.io/projected/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-kube-api-access-d2nlr\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.078941 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c34ed0d-a9d2-4b7a-a006-8486a35d5502-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.094955 4553 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.180380 4553 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.196545 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cbc5c6fc7-8grv2" event={"ID":"568e1d06-f892-49fc-8cdf-ee433c1cee17","Type":"ContainerDied","Data":"d701cd9e1f42243b8b2774a54988d917cd31ffb0125243e6e51bf17907f52245"} Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.196619 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cbc5c6fc7-8grv2" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.200484 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bd6c97f9c-7td27" event={"ID":"a0e3b11b-9596-49bc-bd3e-11b74b885361","Type":"ContainerDied","Data":"634ae2864e052a102eb20e4e63f0c0ba7e1e9f51d819c5c7822817055ed08b09"} Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.200569 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bd6c97f9c-7td27" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.202957 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f698cb877-vqsz6" event={"ID":"95fc6b11-e9bf-4886-9036-8276f127b8bf","Type":"ContainerDied","Data":"c924f01360851f3432d6ec2ee4323179d1a0a1bf888f88edd415e76f5291e0f0"} Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.203190 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f698cb877-vqsz6" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.214507 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c34ed0d-a9d2-4b7a-a006-8486a35d5502","Type":"ContainerDied","Data":"0bc8d426b897921641df1c93ca9a520d21eec2907eb8815a257ac1cfd9c2249d"} Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.214549 4553 scope.go:117] "RemoveContainer" containerID="f14d66411a0bdc931a360b08dfb245604acb636183f17cdd0123b9bb7734cff6" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.214765 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.275686 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6cbc5c6fc7-8grv2"] Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.282452 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6cbc5c6fc7-8grv2"] Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.343312 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-bd6c97f9c-7td27"] Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.359462 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-bd6c97f9c-7td27"] Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.395094 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f698cb877-vqsz6"] Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.410924 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f698cb877-vqsz6"] Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.424647 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.430844 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-external-api-0"] Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.441192 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:49:33 crc kubenswrapper[4553]: E0930 19:49:33.441632 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c34ed0d-a9d2-4b7a-a006-8486a35d5502" containerName="glance-httpd" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.441645 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c34ed0d-a9d2-4b7a-a006-8486a35d5502" containerName="glance-httpd" Sep 30 19:49:33 crc kubenswrapper[4553]: E0930 19:49:33.441658 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c34ed0d-a9d2-4b7a-a006-8486a35d5502" containerName="glance-log" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.441664 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c34ed0d-a9d2-4b7a-a006-8486a35d5502" containerName="glance-log" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.441842 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c34ed0d-a9d2-4b7a-a006-8486a35d5502" containerName="glance-log" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.441857 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c34ed0d-a9d2-4b7a-a006-8486a35d5502" containerName="glance-httpd" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.442773 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.444787 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.445681 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.449474 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.520413 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="568e1d06-f892-49fc-8cdf-ee433c1cee17" path="/var/lib/kubelet/pods/568e1d06-f892-49fc-8cdf-ee433c1cee17/volumes" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.520848 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c34ed0d-a9d2-4b7a-a006-8486a35d5502" path="/var/lib/kubelet/pods/6c34ed0d-a9d2-4b7a-a006-8486a35d5502/volumes" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.521470 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95fc6b11-e9bf-4886-9036-8276f127b8bf" path="/var/lib/kubelet/pods/95fc6b11-e9bf-4886-9036-8276f127b8bf/volumes" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.523013 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e3b11b-9596-49bc-bd3e-11b74b885361" path="/var/lib/kubelet/pods/a0e3b11b-9596-49bc-bd3e-11b74b885361/volumes" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.589849 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " 
pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.590017 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-scripts\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.590065 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.590099 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.590305 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.590436 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwnts\" (UniqueName: \"kubernetes.io/projected/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-kube-api-access-lwnts\") pod \"glance-default-external-api-0\" (UID: 
\"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.590512 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-logs\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.590677 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-config-data\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.692092 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-config-data\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.692496 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.692529 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-scripts\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " 
pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.692554 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.692589 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.692625 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.692660 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwnts\" (UniqueName: \"kubernetes.io/projected/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-kube-api-access-lwnts\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.692680 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-logs\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc 
kubenswrapper[4553]: I0930 19:49:33.693265 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-logs\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.693989 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.705417 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.705754 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-scripts\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.705875 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.706366 4553 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.711468 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwnts\" (UniqueName: \"kubernetes.io/projected/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-kube-api-access-lwnts\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.713589 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-config-data\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: E0930 19:49:33.732550 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Sep 30 19:49:33 crc kubenswrapper[4553]: E0930 19:49:33.732935 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cplpz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-k52t8_openstack(c9958ea9-408e-4b14-8b23-dd1662654cd1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:49:33 crc kubenswrapper[4553]: E0930 19:49:33.734633 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-k52t8" 
podUID="c9958ea9-408e-4b14-8b23-dd1662654cd1" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.750995 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " pod="openstack/glance-default-external-api-0" Sep 30 19:49:33 crc kubenswrapper[4553]: I0930 19:49:33.757477 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:49:34 crc kubenswrapper[4553]: E0930 19:49:34.224600 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-k52t8" podUID="c9958ea9-408e-4b14-8b23-dd1662654cd1" Sep 30 19:49:34 crc kubenswrapper[4553]: I0930 19:49:34.446024 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-6mpwk" podUID="7cde397a-0d8e-416d-8ac5-6051a5db9878" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Sep 30 19:49:34 crc kubenswrapper[4553]: E0930 19:49:34.977619 4553 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Sep 30 19:49:34 crc kubenswrapper[4553]: E0930 19:49:34.977841 4553 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zdh9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-prf67_openstack(04f1abd5-5975-4038-98b3-4b6ff0e858f7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 19:49:34 crc kubenswrapper[4553]: E0930 19:49:34.982167 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-prf67" podUID="04f1abd5-5975-4038-98b3-4b6ff0e858f7" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.010216 4553 scope.go:117] "RemoveContainer" containerID="5c1c15be62b603d014e90280125c05ea76a8d65a111433410684c3b49722d842" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.126949 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.141728 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.222694 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-ovsdbserver-sb\") pod \"7cde397a-0d8e-416d-8ac5-6051a5db9878\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.222742 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-dns-svc\") pod \"7cde397a-0d8e-416d-8ac5-6051a5db9878\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.222813 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ft6c\" (UniqueName: \"kubernetes.io/projected/7cde397a-0d8e-416d-8ac5-6051a5db9878-kube-api-access-6ft6c\") pod \"7cde397a-0d8e-416d-8ac5-6051a5db9878\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.222847 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-config\") pod \"7cde397a-0d8e-416d-8ac5-6051a5db9878\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.222877 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-ovsdbserver-nb\") pod \"7cde397a-0d8e-416d-8ac5-6051a5db9878\" (UID: \"7cde397a-0d8e-416d-8ac5-6051a5db9878\") " Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.248087 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7cde397a-0d8e-416d-8ac5-6051a5db9878-kube-api-access-6ft6c" (OuterVolumeSpecName: "kube-api-access-6ft6c") pod "7cde397a-0d8e-416d-8ac5-6051a5db9878" (UID: "7cde397a-0d8e-416d-8ac5-6051a5db9878"). InnerVolumeSpecName "kube-api-access-6ft6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.268360 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-6mpwk" event={"ID":"7cde397a-0d8e-416d-8ac5-6051a5db9878","Type":"ContainerDied","Data":"44c762c2a9f38e95872a1b169753aab21b3e0f3d447efafabf65db1bc7f45fb0"} Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.268410 4553 scope.go:117] "RemoveContainer" containerID="42d2955b10ec7574c2f79a48f67fd3defe5c6fca5fab753f3f084a8aab1730c0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.268500 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-6mpwk" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.279238 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c3b3f976-3c25-4141-b586-2f06390b1a7a","Type":"ContainerDied","Data":"f51cb0446b3a84dc865387d5f4ec80caa9eb9232d47d43838a8cc90d1e138ac8"} Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.279251 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: E0930 19:49:35.281556 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-prf67" podUID="04f1abd5-5975-4038-98b3-4b6ff0e858f7" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.288180 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7cde397a-0d8e-416d-8ac5-6051a5db9878" (UID: "7cde397a-0d8e-416d-8ac5-6051a5db9878"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.292186 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7cde397a-0d8e-416d-8ac5-6051a5db9878" (UID: "7cde397a-0d8e-416d-8ac5-6051a5db9878"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.314524 4553 scope.go:117] "RemoveContainer" containerID="9d36972fe3d8d64d99acbe0d9410686f467cb4167cb00a1212af00914d6680b2" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.320426 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7cde397a-0d8e-416d-8ac5-6051a5db9878" (UID: "7cde397a-0d8e-416d-8ac5-6051a5db9878"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.324538 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b3f976-3c25-4141-b586-2f06390b1a7a-logs\") pod \"c3b3f976-3c25-4141-b586-2f06390b1a7a\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.324609 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"c3b3f976-3c25-4141-b586-2f06390b1a7a\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.324695 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b3f976-3c25-4141-b586-2f06390b1a7a-httpd-run\") pod \"c3b3f976-3c25-4141-b586-2f06390b1a7a\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.324743 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk2nt\" (UniqueName: \"kubernetes.io/projected/c3b3f976-3c25-4141-b586-2f06390b1a7a-kube-api-access-jk2nt\") pod \"c3b3f976-3c25-4141-b586-2f06390b1a7a\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.324773 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-combined-ca-bundle\") pod \"c3b3f976-3c25-4141-b586-2f06390b1a7a\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.324793 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-scripts\") pod \"c3b3f976-3c25-4141-b586-2f06390b1a7a\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.324846 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-config-data\") pod \"c3b3f976-3c25-4141-b586-2f06390b1a7a\" (UID: \"c3b3f976-3c25-4141-b586-2f06390b1a7a\") " Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.325208 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.325226 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.325238 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ft6c\" (UniqueName: \"kubernetes.io/projected/7cde397a-0d8e-416d-8ac5-6051a5db9878-kube-api-access-6ft6c\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.325258 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.326441 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3b3f976-3c25-4141-b586-2f06390b1a7a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c3b3f976-3c25-4141-b586-2f06390b1a7a" (UID: "c3b3f976-3c25-4141-b586-2f06390b1a7a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.326498 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3b3f976-3c25-4141-b586-2f06390b1a7a-logs" (OuterVolumeSpecName: "logs") pod "c3b3f976-3c25-4141-b586-2f06390b1a7a" (UID: "c3b3f976-3c25-4141-b586-2f06390b1a7a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.330158 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-scripts" (OuterVolumeSpecName: "scripts") pod "c3b3f976-3c25-4141-b586-2f06390b1a7a" (UID: "c3b3f976-3c25-4141-b586-2f06390b1a7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.330179 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b3f976-3c25-4141-b586-2f06390b1a7a-kube-api-access-jk2nt" (OuterVolumeSpecName: "kube-api-access-jk2nt") pod "c3b3f976-3c25-4141-b586-2f06390b1a7a" (UID: "c3b3f976-3c25-4141-b586-2f06390b1a7a"). InnerVolumeSpecName "kube-api-access-jk2nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.334125 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "c3b3f976-3c25-4141-b586-2f06390b1a7a" (UID: "c3b3f976-3c25-4141-b586-2f06390b1a7a"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.336265 4553 scope.go:117] "RemoveContainer" containerID="c77b0d6e720966437e74019f162076e7fcc1d5b7ea79aa40372565c330f7407f" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.339497 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-config" (OuterVolumeSpecName: "config") pod "7cde397a-0d8e-416d-8ac5-6051a5db9878" (UID: "7cde397a-0d8e-416d-8ac5-6051a5db9878"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.362779 4553 scope.go:117] "RemoveContainer" containerID="1f48e20b8965f6f40e7fa346cf6d36459b18112ea605dad8898760d2f9807886" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.370457 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3b3f976-3c25-4141-b586-2f06390b1a7a" (UID: "c3b3f976-3c25-4141-b586-2f06390b1a7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.393642 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-config-data" (OuterVolumeSpecName: "config-data") pod "c3b3f976-3c25-4141-b586-2f06390b1a7a" (UID: "c3b3f976-3c25-4141-b586-2f06390b1a7a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.426405 4553 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.426434 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cde397a-0d8e-416d-8ac5-6051a5db9878-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.426446 4553 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3b3f976-3c25-4141-b586-2f06390b1a7a-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.426455 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk2nt\" (UniqueName: \"kubernetes.io/projected/c3b3f976-3c25-4141-b586-2f06390b1a7a-kube-api-access-jk2nt\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.426464 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.426474 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.426481 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b3f976-3c25-4141-b586-2f06390b1a7a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.426489 4553 reconciler_common.go:293] "Volume detached for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b3f976-3c25-4141-b586-2f06390b1a7a-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.446121 4553 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.528121 4553 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.546776 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84c849768b-8k9mh"] Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.551106 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-868c6b469d-rhw7t"] Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.590756 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-6mpwk"] Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.602436 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-6mpwk"] Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.608292 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.613704 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.651628 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:49:35 crc kubenswrapper[4553]: E0930 19:49:35.651946 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b3f976-3c25-4141-b586-2f06390b1a7a" containerName="glance-httpd" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 
19:49:35.651965 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b3f976-3c25-4141-b586-2f06390b1a7a" containerName="glance-httpd" Sep 30 19:49:35 crc kubenswrapper[4553]: E0930 19:49:35.651979 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cde397a-0d8e-416d-8ac5-6051a5db9878" containerName="dnsmasq-dns" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.652001 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cde397a-0d8e-416d-8ac5-6051a5db9878" containerName="dnsmasq-dns" Sep 30 19:49:35 crc kubenswrapper[4553]: E0930 19:49:35.652017 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cde397a-0d8e-416d-8ac5-6051a5db9878" containerName="init" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.652023 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cde397a-0d8e-416d-8ac5-6051a5db9878" containerName="init" Sep 30 19:49:35 crc kubenswrapper[4553]: E0930 19:49:35.652051 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b3f976-3c25-4141-b586-2f06390b1a7a" containerName="glance-log" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.652057 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b3f976-3c25-4141-b586-2f06390b1a7a" containerName="glance-log" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.652207 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cde397a-0d8e-416d-8ac5-6051a5db9878" containerName="dnsmasq-dns" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.652225 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b3f976-3c25-4141-b586-2f06390b1a7a" containerName="glance-log" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.652236 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b3f976-3c25-4141-b586-2f06390b1a7a" containerName="glance-httpd" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.653023 4553 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.658646 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.658851 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.679288 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.730074 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pk229"] Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.735858 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.735997 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czfrm\" (UniqueName: \"kubernetes.io/projected/8b75f5b7-0080-4f75-9012-c89c87d08202-kube-api-access-czfrm\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.736056 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc 
kubenswrapper[4553]: I0930 19:49:35.736080 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.736120 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.736150 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.736173 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b75f5b7-0080-4f75-9012-c89c87d08202-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.736219 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b75f5b7-0080-4f75-9012-c89c87d08202-logs\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 
19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.845426 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czfrm\" (UniqueName: \"kubernetes.io/projected/8b75f5b7-0080-4f75-9012-c89c87d08202-kube-api-access-czfrm\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.845768 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.845791 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.845817 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.845851 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.845887 4553 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b75f5b7-0080-4f75-9012-c89c87d08202-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.845920 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b75f5b7-0080-4f75-9012-c89c87d08202-logs\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.845981 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.848807 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b75f5b7-0080-4f75-9012-c89c87d08202-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.849389 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b75f5b7-0080-4f75-9012-c89c87d08202-logs\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.849653 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.863126 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.863579 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.884020 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.886545 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.887107 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czfrm\" (UniqueName: 
\"kubernetes.io/projected/8b75f5b7-0080-4f75-9012-c89c87d08202-kube-api-access-czfrm\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.914831 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.980256 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:49:35 crc kubenswrapper[4553]: I0930 19:49:35.981098 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:36 crc kubenswrapper[4553]: I0930 19:49:36.322109 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pk229" event={"ID":"2633e01b-c518-4077-af93-7ba213150186","Type":"ContainerStarted","Data":"5b3b48e5cc114e37b82b361a145372fc813009d0a4276f9be4bd1c815092a7b5"} Sep 30 19:49:36 crc kubenswrapper[4553]: I0930 19:49:36.322310 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pk229" event={"ID":"2633e01b-c518-4077-af93-7ba213150186","Type":"ContainerStarted","Data":"526f6bb70600fb412c641704abaddf4f467f29c56437e8cd9c4e4d541926ecfa"} Sep 30 19:49:36 crc kubenswrapper[4553]: I0930 19:49:36.350153 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125","Type":"ContainerStarted","Data":"3946f228f64355eeb6ceb0b10cd22babf305c008dd3c43635c21b3efe1c2e95d"} Sep 30 19:49:36 crc kubenswrapper[4553]: I0930 19:49:36.353883 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-868c6b469d-rhw7t" event={"ID":"849f4ec8-2741-4c83-82d8-135a24b43447","Type":"ContainerStarted","Data":"aa044d97f5410fd4bbcf32a7f05d67e6027d5dd964cc54a99c05095350722a03"} Sep 30 19:49:36 crc kubenswrapper[4553]: I0930 19:49:36.365255 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c849768b-8k9mh" event={"ID":"17921f25-bee1-4e2e-a9e2-50669133664e","Type":"ContainerStarted","Data":"8ed9ddd1b071890d8c64603020476049c182c88d421c1634793e232667dd10a9"} Sep 30 19:49:36 crc kubenswrapper[4553]: I0930 19:49:36.373189 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906fbd6e-e72f-428f-b182-f583c009fc93","Type":"ContainerStarted","Data":"eaa3c6c1e223c9364eb621d094f9988bd88e3ef3bc3795406d83865beacbcb14"} Sep 30 19:49:36 crc kubenswrapper[4553]: I0930 19:49:36.524278 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pk229" podStartSLOduration=14.524258213 podStartE2EDuration="14.524258213s" podCreationTimestamp="2025-09-30 19:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:36.344398457 +0000 UTC m=+1029.543900587" watchObservedRunningTime="2025-09-30 19:49:36.524258213 +0000 UTC m=+1029.723760343" Sep 30 19:49:36 crc kubenswrapper[4553]: I0930 19:49:36.624673 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:49:36 crc kubenswrapper[4553]: W0930 19:49:36.673560 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b75f5b7_0080_4f75_9012_c89c87d08202.slice/crio-1d432502db72c94df0b6fe2abe249d7c72155fcd3ca318a75f1c6d8da0a64be5 WatchSource:0}: Error finding container 1d432502db72c94df0b6fe2abe249d7c72155fcd3ca318a75f1c6d8da0a64be5: Status 404 returned error can't find the 
container with id 1d432502db72c94df0b6fe2abe249d7c72155fcd3ca318a75f1c6d8da0a64be5 Sep 30 19:49:37 crc kubenswrapper[4553]: I0930 19:49:37.400601 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c849768b-8k9mh" event={"ID":"17921f25-bee1-4e2e-a9e2-50669133664e","Type":"ContainerStarted","Data":"433775455daced9402500b2f928308e29c64c51fa046fc1f0a6989a136987f2d"} Sep 30 19:49:37 crc kubenswrapper[4553]: I0930 19:49:37.400985 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c849768b-8k9mh" event={"ID":"17921f25-bee1-4e2e-a9e2-50669133664e","Type":"ContainerStarted","Data":"865daf527791fb42a7e38b3ccc019bcf19e002bf322605476e21aceb0aab4be7"} Sep 30 19:49:37 crc kubenswrapper[4553]: I0930 19:49:37.418330 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b75f5b7-0080-4f75-9012-c89c87d08202","Type":"ContainerStarted","Data":"1d432502db72c94df0b6fe2abe249d7c72155fcd3ca318a75f1c6d8da0a64be5"} Sep 30 19:49:37 crc kubenswrapper[4553]: I0930 19:49:37.430310 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906fbd6e-e72f-428f-b182-f583c009fc93","Type":"ContainerStarted","Data":"56bc6c36a9d32687a0c688eca63263beb2355f09162b188e93be26dee739122f"} Sep 30 19:49:37 crc kubenswrapper[4553]: I0930 19:49:37.433622 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125","Type":"ContainerStarted","Data":"5245f0f62b37906870ac327c579cdf7896ac5667eec53691486f374e74c80ba6"} Sep 30 19:49:37 crc kubenswrapper[4553]: I0930 19:49:37.437421 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-868c6b469d-rhw7t" event={"ID":"849f4ec8-2741-4c83-82d8-135a24b43447","Type":"ContainerStarted","Data":"8ff81ea747885714e105c6375ebce9a46010eb768beacf38f8bb8b073a927352"} Sep 30 19:49:37 crc kubenswrapper[4553]: I0930 
19:49:37.437453 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-868c6b469d-rhw7t" event={"ID":"849f4ec8-2741-4c83-82d8-135a24b43447","Type":"ContainerStarted","Data":"cf79aa0f1e7c206524e1f8a5e9082d9ddb7cdc608628261c9dbfa3ad2342808e"} Sep 30 19:49:37 crc kubenswrapper[4553]: I0930 19:49:37.465901 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-868c6b469d-rhw7t" podStartSLOduration=25.941096296 podStartE2EDuration="26.465885384s" podCreationTimestamp="2025-09-30 19:49:11 +0000 UTC" firstStartedPulling="2025-09-30 19:49:35.552961043 +0000 UTC m=+1028.752463174" lastFinishedPulling="2025-09-30 19:49:36.077750132 +0000 UTC m=+1029.277252262" observedRunningTime="2025-09-30 19:49:37.46576159 +0000 UTC m=+1030.665263720" watchObservedRunningTime="2025-09-30 19:49:37.465885384 +0000 UTC m=+1030.665387514" Sep 30 19:49:37 crc kubenswrapper[4553]: I0930 19:49:37.466774 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-84c849768b-8k9mh" podStartSLOduration=25.911205898 podStartE2EDuration="26.466767187s" podCreationTimestamp="2025-09-30 19:49:11 +0000 UTC" firstStartedPulling="2025-09-30 19:49:35.553566449 +0000 UTC m=+1028.753068579" lastFinishedPulling="2025-09-30 19:49:36.109127738 +0000 UTC m=+1029.308629868" observedRunningTime="2025-09-30 19:49:37.43047319 +0000 UTC m=+1030.629975310" watchObservedRunningTime="2025-09-30 19:49:37.466767187 +0000 UTC m=+1030.666269317" Sep 30 19:49:37 crc kubenswrapper[4553]: I0930 19:49:37.523554 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cde397a-0d8e-416d-8ac5-6051a5db9878" path="/var/lib/kubelet/pods/7cde397a-0d8e-416d-8ac5-6051a5db9878/volumes" Sep 30 19:49:37 crc kubenswrapper[4553]: I0930 19:49:37.524463 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b3f976-3c25-4141-b586-2f06390b1a7a" 
path="/var/lib/kubelet/pods/c3b3f976-3c25-4141-b586-2f06390b1a7a/volumes" Sep 30 19:49:38 crc kubenswrapper[4553]: I0930 19:49:38.447824 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125","Type":"ContainerStarted","Data":"27bcd308b1b3b96ab1caae008d19f9236f5222fe9841d2ca6606cb6543edbaad"} Sep 30 19:49:38 crc kubenswrapper[4553]: I0930 19:49:38.451206 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b75f5b7-0080-4f75-9012-c89c87d08202","Type":"ContainerStarted","Data":"1886f660e393872d040f7bb9a8675d9af5153af6883480cd52e1543eff8c3e7e"} Sep 30 19:49:38 crc kubenswrapper[4553]: I0930 19:49:38.453623 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zsslz" event={"ID":"08c9fecb-7dc9-4aed-b134-98995f1cf280","Type":"ContainerStarted","Data":"fdb76c3419c546d756a94229f0a6eb6009114b7423d2675741c8614ce922dcad"} Sep 30 19:49:38 crc kubenswrapper[4553]: I0930 19:49:38.487970 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.487952271 podStartE2EDuration="5.487952271s" podCreationTimestamp="2025-09-30 19:49:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:38.470188882 +0000 UTC m=+1031.669691012" watchObservedRunningTime="2025-09-30 19:49:38.487952271 +0000 UTC m=+1031.687454391" Sep 30 19:49:38 crc kubenswrapper[4553]: I0930 19:49:38.510543 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zsslz" podStartSLOduration=3.713205175 podStartE2EDuration="39.510523049s" podCreationTimestamp="2025-09-30 19:48:59 +0000 UTC" firstStartedPulling="2025-09-30 19:49:01.366119391 +0000 UTC m=+994.565621521" lastFinishedPulling="2025-09-30 
19:49:37.163437275 +0000 UTC m=+1030.362939395" observedRunningTime="2025-09-30 19:49:38.499547983 +0000 UTC m=+1031.699050113" watchObservedRunningTime="2025-09-30 19:49:38.510523049 +0000 UTC m=+1031.710025179"
Sep 30 19:49:39 crc kubenswrapper[4553]: I0930 19:49:39.447150 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-6mpwk" podUID="7cde397a-0d8e-416d-8ac5-6051a5db9878" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout"
Sep 30 19:49:39 crc kubenswrapper[4553]: I0930 19:49:39.467094 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b75f5b7-0080-4f75-9012-c89c87d08202","Type":"ContainerStarted","Data":"fae9530c48673a42c1b2c5cff236adc67b416ff8342ccaeb00fc52112e0c160e"}
Sep 30 19:49:39 crc kubenswrapper[4553]: I0930 19:49:39.482587 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.482576129 podStartE2EDuration="4.482576129s" podCreationTimestamp="2025-09-30 19:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:39.480913405 +0000 UTC m=+1032.680415535" watchObservedRunningTime="2025-09-30 19:49:39.482576129 +0000 UTC m=+1032.682078259"
Sep 30 19:49:40 crc kubenswrapper[4553]: I0930 19:49:40.474458 4553 generic.go:334] "Generic (PLEG): container finished" podID="2633e01b-c518-4077-af93-7ba213150186" containerID="5b3b48e5cc114e37b82b361a145372fc813009d0a4276f9be4bd1c815092a7b5" exitCode=0
Sep 30 19:49:40 crc kubenswrapper[4553]: I0930 19:49:40.474804 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pk229" event={"ID":"2633e01b-c518-4077-af93-7ba213150186","Type":"ContainerDied","Data":"5b3b48e5cc114e37b82b361a145372fc813009d0a4276f9be4bd1c815092a7b5"}
Sep 30 19:49:40 crc kubenswrapper[4553]: I0930 19:49:40.479880 4553 generic.go:334] "Generic (PLEG): container finished" podID="d1bf2fc0-8737-4258-9bf8-1978001043f9" containerID="c35f71ed62ab9c4849e25d2f14da54779822bd9438fec58dbc0c4ac04c0373ed" exitCode=0
Sep 30 19:49:40 crc kubenswrapper[4553]: I0930 19:49:40.480836 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f7rgm" event={"ID":"d1bf2fc0-8737-4258-9bf8-1978001043f9","Type":"ContainerDied","Data":"c35f71ed62ab9c4849e25d2f14da54779822bd9438fec58dbc0c4ac04c0373ed"}
Sep 30 19:49:41 crc kubenswrapper[4553]: I0930 19:49:41.427048 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-84c849768b-8k9mh"
Sep 30 19:49:41 crc kubenswrapper[4553]: I0930 19:49:41.427173 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-84c849768b-8k9mh"
Sep 30 19:49:41 crc kubenswrapper[4553]: I0930 19:49:41.493333 4553 generic.go:334] "Generic (PLEG): container finished" podID="08c9fecb-7dc9-4aed-b134-98995f1cf280" containerID="fdb76c3419c546d756a94229f0a6eb6009114b7423d2675741c8614ce922dcad" exitCode=0
Sep 30 19:49:41 crc kubenswrapper[4553]: I0930 19:49:41.493462 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zsslz" event={"ID":"08c9fecb-7dc9-4aed-b134-98995f1cf280","Type":"ContainerDied","Data":"fdb76c3419c546d756a94229f0a6eb6009114b7423d2675741c8614ce922dcad"}
Sep 30 19:49:41 crc kubenswrapper[4553]: I0930 19:49:41.554401 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-868c6b469d-rhw7t"
Sep 30 19:49:41 crc kubenswrapper[4553]: I0930 19:49:41.555157 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-868c6b469d-rhw7t"
Sep 30 19:49:42 crc kubenswrapper[4553]: I0930 19:49:42.980008 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zsslz"
Sep 30 19:49:42 crc kubenswrapper[4553]: I0930 19:49:42.987422 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pk229"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.002782 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-f7rgm"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106232 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-fernet-keys\") pod \"2633e01b-c518-4077-af93-7ba213150186\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106516 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1bf2fc0-8737-4258-9bf8-1978001043f9-config\") pod \"d1bf2fc0-8737-4258-9bf8-1978001043f9\" (UID: \"d1bf2fc0-8737-4258-9bf8-1978001043f9\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106535 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-scripts\") pod \"08c9fecb-7dc9-4aed-b134-98995f1cf280\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106554 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-scripts\") pod \"2633e01b-c518-4077-af93-7ba213150186\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106610 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-config-data\") pod \"08c9fecb-7dc9-4aed-b134-98995f1cf280\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106648 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-credential-keys\") pod \"2633e01b-c518-4077-af93-7ba213150186\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106669 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-combined-ca-bundle\") pod \"2633e01b-c518-4077-af93-7ba213150186\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106691 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-combined-ca-bundle\") pod \"08c9fecb-7dc9-4aed-b134-98995f1cf280\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106716 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt5fx\" (UniqueName: \"kubernetes.io/projected/08c9fecb-7dc9-4aed-b134-98995f1cf280-kube-api-access-lt5fx\") pod \"08c9fecb-7dc9-4aed-b134-98995f1cf280\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106736 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg5sr\" (UniqueName: \"kubernetes.io/projected/d1bf2fc0-8737-4258-9bf8-1978001043f9-kube-api-access-dg5sr\") pod \"d1bf2fc0-8737-4258-9bf8-1978001043f9\" (UID: \"d1bf2fc0-8737-4258-9bf8-1978001043f9\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106753 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1bf2fc0-8737-4258-9bf8-1978001043f9-combined-ca-bundle\") pod \"d1bf2fc0-8737-4258-9bf8-1978001043f9\" (UID: \"d1bf2fc0-8737-4258-9bf8-1978001043f9\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106776 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c9fecb-7dc9-4aed-b134-98995f1cf280-logs\") pod \"08c9fecb-7dc9-4aed-b134-98995f1cf280\" (UID: \"08c9fecb-7dc9-4aed-b134-98995f1cf280\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106831 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq7zj\" (UniqueName: \"kubernetes.io/projected/2633e01b-c518-4077-af93-7ba213150186-kube-api-access-cq7zj\") pod \"2633e01b-c518-4077-af93-7ba213150186\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.106857 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-config-data\") pod \"2633e01b-c518-4077-af93-7ba213150186\" (UID: \"2633e01b-c518-4077-af93-7ba213150186\") "
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.111141 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c9fecb-7dc9-4aed-b134-98995f1cf280-logs" (OuterVolumeSpecName: "logs") pod "08c9fecb-7dc9-4aed-b134-98995f1cf280" (UID: "08c9fecb-7dc9-4aed-b134-98995f1cf280"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.118399 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2633e01b-c518-4077-af93-7ba213150186" (UID: "2633e01b-c518-4077-af93-7ba213150186"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.120105 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2633e01b-c518-4077-af93-7ba213150186-kube-api-access-cq7zj" (OuterVolumeSpecName: "kube-api-access-cq7zj") pod "2633e01b-c518-4077-af93-7ba213150186" (UID: "2633e01b-c518-4077-af93-7ba213150186"). InnerVolumeSpecName "kube-api-access-cq7zj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.128675 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1bf2fc0-8737-4258-9bf8-1978001043f9-kube-api-access-dg5sr" (OuterVolumeSpecName: "kube-api-access-dg5sr") pod "d1bf2fc0-8737-4258-9bf8-1978001043f9" (UID: "d1bf2fc0-8737-4258-9bf8-1978001043f9"). InnerVolumeSpecName "kube-api-access-dg5sr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.129223 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c9fecb-7dc9-4aed-b134-98995f1cf280-kube-api-access-lt5fx" (OuterVolumeSpecName: "kube-api-access-lt5fx") pod "08c9fecb-7dc9-4aed-b134-98995f1cf280" (UID: "08c9fecb-7dc9-4aed-b134-98995f1cf280"). InnerVolumeSpecName "kube-api-access-lt5fx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.130630 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2633e01b-c518-4077-af93-7ba213150186" (UID: "2633e01b-c518-4077-af93-7ba213150186"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.130652 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-scripts" (OuterVolumeSpecName: "scripts") pod "08c9fecb-7dc9-4aed-b134-98995f1cf280" (UID: "08c9fecb-7dc9-4aed-b134-98995f1cf280"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.130771 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-scripts" (OuterVolumeSpecName: "scripts") pod "2633e01b-c518-4077-af93-7ba213150186" (UID: "2633e01b-c518-4077-af93-7ba213150186"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.147962 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1bf2fc0-8737-4258-9bf8-1978001043f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1bf2fc0-8737-4258-9bf8-1978001043f9" (UID: "d1bf2fc0-8737-4258-9bf8-1978001043f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.161912 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2633e01b-c518-4077-af93-7ba213150186" (UID: "2633e01b-c518-4077-af93-7ba213150186"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.173337 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1bf2fc0-8737-4258-9bf8-1978001043f9-config" (OuterVolumeSpecName: "config") pod "d1bf2fc0-8737-4258-9bf8-1978001043f9" (UID: "d1bf2fc0-8737-4258-9bf8-1978001043f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.181663 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-config-data" (OuterVolumeSpecName: "config-data") pod "08c9fecb-7dc9-4aed-b134-98995f1cf280" (UID: "08c9fecb-7dc9-4aed-b134-98995f1cf280"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.181851 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-config-data" (OuterVolumeSpecName: "config-data") pod "2633e01b-c518-4077-af93-7ba213150186" (UID: "2633e01b-c518-4077-af93-7ba213150186"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.185184 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08c9fecb-7dc9-4aed-b134-98995f1cf280" (UID: "08c9fecb-7dc9-4aed-b134-98995f1cf280"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.208940 4553 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-fernet-keys\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.208966 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1bf2fc0-8737-4258-9bf8-1978001043f9-config\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.208975 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.208985 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.208992 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.209000 4553 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-credential-keys\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.209011 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.209019 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c9fecb-7dc9-4aed-b134-98995f1cf280-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.209028 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt5fx\" (UniqueName: \"kubernetes.io/projected/08c9fecb-7dc9-4aed-b134-98995f1cf280-kube-api-access-lt5fx\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.209090 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg5sr\" (UniqueName: \"kubernetes.io/projected/d1bf2fc0-8737-4258-9bf8-1978001043f9-kube-api-access-dg5sr\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.209099 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1bf2fc0-8737-4258-9bf8-1978001043f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.209108 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c9fecb-7dc9-4aed-b134-98995f1cf280-logs\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.209117 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq7zj\" (UniqueName: \"kubernetes.io/projected/2633e01b-c518-4077-af93-7ba213150186-kube-api-access-cq7zj\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.209125 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2633e01b-c518-4077-af93-7ba213150186-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.524952 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pk229"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.544794 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906fbd6e-e72f-428f-b182-f583c009fc93","Type":"ContainerStarted","Data":"c27e637cb185e60de88caba20847f484972bb2019a3d306286cc5bef0a9e3716"}
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.544855 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pk229" event={"ID":"2633e01b-c518-4077-af93-7ba213150186","Type":"ContainerDied","Data":"526f6bb70600fb412c641704abaddf4f467f29c56437e8cd9c4e4d541926ecfa"}
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.545007 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="526f6bb70600fb412c641704abaddf4f467f29c56437e8cd9c4e4d541926ecfa"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.557921 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f7rgm" event={"ID":"d1bf2fc0-8737-4258-9bf8-1978001043f9","Type":"ContainerDied","Data":"414def7d452087bf14dbebdc33668a21c43ba73d7e77ec8355146aeb3247bb57"}
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.557975 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="414def7d452087bf14dbebdc33668a21c43ba73d7e77ec8355146aeb3247bb57"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.558099 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-f7rgm"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.560687 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zsslz" event={"ID":"08c9fecb-7dc9-4aed-b134-98995f1cf280","Type":"ContainerDied","Data":"a99f92f2f006a208b306cc216ad4ef90c673dd3adf9b3adc3a9bc73bc016b7a1"}
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.560726 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a99f92f2f006a208b306cc216ad4ef90c673dd3adf9b3adc3a9bc73bc016b7a1"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.560794 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zsslz"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.715130 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-89c84b54-dlsmt"]
Sep 30 19:49:43 crc kubenswrapper[4553]: E0930 19:49:43.715438 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2633e01b-c518-4077-af93-7ba213150186" containerName="keystone-bootstrap"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.715454 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="2633e01b-c518-4077-af93-7ba213150186" containerName="keystone-bootstrap"
Sep 30 19:49:43 crc kubenswrapper[4553]: E0930 19:49:43.715488 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1bf2fc0-8737-4258-9bf8-1978001043f9" containerName="neutron-db-sync"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.715494 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1bf2fc0-8737-4258-9bf8-1978001043f9" containerName="neutron-db-sync"
Sep 30 19:49:43 crc kubenswrapper[4553]: E0930 19:49:43.715508 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c9fecb-7dc9-4aed-b134-98995f1cf280" containerName="placement-db-sync"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.715514 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c9fecb-7dc9-4aed-b134-98995f1cf280" containerName="placement-db-sync"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.715672 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="2633e01b-c518-4077-af93-7ba213150186" containerName="keystone-bootstrap"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.715695 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1bf2fc0-8737-4258-9bf8-1978001043f9" containerName="neutron-db-sync"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.715705 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c9fecb-7dc9-4aed-b134-98995f1cf280" containerName="placement-db-sync"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.716553 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.718799 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qq8m2"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.719010 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.719067 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.719247 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.730588 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.735067 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-89c84b54-dlsmt"]
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.757842 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.758456 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.817390 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.818497 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-scripts\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.818528 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-internal-tls-certs\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.818678 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-public-tls-certs\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.818907 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjlr8\" (UniqueName: \"kubernetes.io/projected/a4059329-d42b-4d54-b952-feb9f5bd53b6-kube-api-access-fjlr8\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.819279 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4059329-d42b-4d54-b952-feb9f5bd53b6-logs\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.819508 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.819582 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-config-data\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.819624 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-combined-ca-bundle\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.921570 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4059329-d42b-4d54-b952-feb9f5bd53b6-logs\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.921846 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-config-data\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.921865 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-combined-ca-bundle\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.921881 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-scripts\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.921899 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-internal-tls-certs\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.921943 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-public-tls-certs\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.922006 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjlr8\" (UniqueName: \"kubernetes.io/projected/a4059329-d42b-4d54-b952-feb9f5bd53b6-kube-api-access-fjlr8\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.922162 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4059329-d42b-4d54-b952-feb9f5bd53b6-logs\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.927721 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-config-data\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.928174 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-public-tls-certs\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.929484 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-scripts\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.933485 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-internal-tls-certs\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.934980 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4059329-d42b-4d54-b952-feb9f5bd53b6-combined-ca-bundle\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:43 crc kubenswrapper[4553]: I0930 19:49:43.947820 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjlr8\" (UniqueName: \"kubernetes.io/projected/a4059329-d42b-4d54-b952-feb9f5bd53b6-kube-api-access-fjlr8\") pod \"placement-89c84b54-dlsmt\" (UID: \"a4059329-d42b-4d54-b952-feb9f5bd53b6\") " pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.030833 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-89c84b54-dlsmt"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.118389 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-578c97db4-g464k"]
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.126130 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.140225 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.140456 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.140584 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.140834 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hswpl"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.140940 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.142378 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.167681 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-578c97db4-g464k"]
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.203686 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-credential-keys\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.203758 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6x6k\" (UniqueName: \"kubernetes.io/projected/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-kube-api-access-h6x6k\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.203806 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-public-tls-certs\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.203836 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-fernet-keys\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.203875 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-config-data\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.203906 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-scripts\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.203938 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-combined-ca-bundle\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.203960 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-internal-tls-certs\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.296558 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hrvtc"]
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.298317 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-hrvtc"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.304924 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-scripts\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.305102 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-combined-ca-bundle\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.305215 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-internal-tls-certs\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.305305 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-credential-keys\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.305402 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6x6k\" (UniqueName: \"kubernetes.io/projected/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-kube-api-access-h6x6k\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.305490 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-public-tls-certs\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.305557 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-fernet-keys\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.305645 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-config-data\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k"
Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.319951 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName:
\"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-scripts\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.321855 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hrvtc"] Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.327021 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-combined-ca-bundle\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.327964 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-fernet-keys\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.331182 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-config-data\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.334610 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-public-tls-certs\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.338833 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-credential-keys\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.341122 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-internal-tls-certs\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.348077 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6x6k\" (UniqueName: \"kubernetes.io/projected/fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5-kube-api-access-h6x6k\") pod \"keystone-578c97db4-g464k\" (UID: \"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5\") " pod="openstack/keystone-578c97db4-g464k" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.411355 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.411416 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-dns-svc\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.411479 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.411521 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.411586 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fht2z\" (UniqueName: \"kubernetes.io/projected/57ddc390-73c9-44d4-941d-63f506633035-kube-api-access-fht2z\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.411609 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-config\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: W0930 19:49:44.463969 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4059329_d42b_4d54_b952_feb9f5bd53b6.slice/crio-041731be4c2124df1e820032ba9f2688f96afc246f5360169ff19cfe0beea2f2 WatchSource:0}: Error finding container 041731be4c2124df1e820032ba9f2688f96afc246f5360169ff19cfe0beea2f2: Status 404 returned error can't find the container with id 041731be4c2124df1e820032ba9f2688f96afc246f5360169ff19cfe0beea2f2 Sep 30 19:49:44 crc 
kubenswrapper[4553]: I0930 19:49:44.474798 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-89c84b54-dlsmt"] Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.485095 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-578c97db4-g464k" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.513892 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fht2z\" (UniqueName: \"kubernetes.io/projected/57ddc390-73c9-44d4-941d-63f506633035-kube-api-access-fht2z\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.514133 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-config\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.514236 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.514322 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-dns-svc\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.514414 4553 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.514504 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.515065 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-config\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.515481 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.516425 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-dns-svc\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.516884 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.517214 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.533900 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-788545c8bb-gjsrq"] Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.535615 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.542839 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fht2z\" (UniqueName: \"kubernetes.io/projected/57ddc390-73c9-44d4-941d-63f506633035-kube-api-access-fht2z\") pod \"dnsmasq-dns-55f844cf75-hrvtc\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.543221 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sl72r" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.543445 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.543606 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.545447 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-788545c8bb-gjsrq"] Sep 
30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.553814 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.592124 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-89c84b54-dlsmt" event={"ID":"a4059329-d42b-4d54-b952-feb9f5bd53b6","Type":"ContainerStarted","Data":"041731be4c2124df1e820032ba9f2688f96afc246f5360169ff19cfe0beea2f2"} Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.592460 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.592535 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.615579 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7sdp\" (UniqueName: \"kubernetes.io/projected/f6470fe1-f2c0-454b-a534-b183258da4f3-kube-api-access-l7sdp\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.615631 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-ovndb-tls-certs\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.615657 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-httpd-config\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " 
pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.615691 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-combined-ca-bundle\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.615746 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-config\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.639604 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.718168 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-config\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.719451 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7sdp\" (UniqueName: \"kubernetes.io/projected/f6470fe1-f2c0-454b-a534-b183258da4f3-kube-api-access-l7sdp\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.719498 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-ovndb-tls-certs\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.719518 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-httpd-config\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.719572 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-combined-ca-bundle\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.733508 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-combined-ca-bundle\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.733994 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-httpd-config\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.734592 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-config\") pod \"neutron-788545c8bb-gjsrq\" (UID: 
\"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.739781 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-ovndb-tls-certs\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.740447 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7sdp\" (UniqueName: \"kubernetes.io/projected/f6470fe1-f2c0-454b-a534-b183258da4f3-kube-api-access-l7sdp\") pod \"neutron-788545c8bb-gjsrq\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:44 crc kubenswrapper[4553]: I0930 19:49:44.888241 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:45 crc kubenswrapper[4553]: I0930 19:49:45.212897 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-578c97db4-g464k"] Sep 30 19:49:45 crc kubenswrapper[4553]: I0930 19:49:45.356817 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hrvtc"] Sep 30 19:49:45 crc kubenswrapper[4553]: I0930 19:49:45.600589 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" event={"ID":"57ddc390-73c9-44d4-941d-63f506633035","Type":"ContainerStarted","Data":"7def9f9829a84ad83184aff086ed82a1a1573db1a2eb8b45fce5f9c0bfb7b529"} Sep 30 19:49:45 crc kubenswrapper[4553]: I0930 19:49:45.614313 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-89c84b54-dlsmt" event={"ID":"a4059329-d42b-4d54-b952-feb9f5bd53b6","Type":"ContainerStarted","Data":"772c65fc92ed92dd83e0d383cc7bf5deb9c32c19914bf83da3b63a6e894cac1a"} Sep 30 19:49:45 crc 
kubenswrapper[4553]: I0930 19:49:45.614349 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-89c84b54-dlsmt" event={"ID":"a4059329-d42b-4d54-b952-feb9f5bd53b6","Type":"ContainerStarted","Data":"e590ed767abfec6322f259486662059a6acf62c0b9b50ab2f2c5700b8861de61"} Sep 30 19:49:45 crc kubenswrapper[4553]: I0930 19:49:45.614381 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-89c84b54-dlsmt" Sep 30 19:49:45 crc kubenswrapper[4553]: I0930 19:49:45.614401 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-89c84b54-dlsmt" Sep 30 19:49:45 crc kubenswrapper[4553]: I0930 19:49:45.618441 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-788545c8bb-gjsrq"] Sep 30 19:49:45 crc kubenswrapper[4553]: I0930 19:49:45.618679 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-578c97db4-g464k" event={"ID":"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5","Type":"ContainerStarted","Data":"cfe24a95467fbb4bbbfc8e1ea5997dabf10e3bfef17083a452758c0b34a8a3eb"} Sep 30 19:49:45 crc kubenswrapper[4553]: W0930 19:49:45.628174 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6470fe1_f2c0_454b_a534_b183258da4f3.slice/crio-f73f03b3cc48db377b74a4c0319380b2fba329c7797c741a264f0f1693622c5a WatchSource:0}: Error finding container f73f03b3cc48db377b74a4c0319380b2fba329c7797c741a264f0f1693622c5a: Status 404 returned error can't find the container with id f73f03b3cc48db377b74a4c0319380b2fba329c7797c741a264f0f1693622c5a Sep 30 19:49:45 crc kubenswrapper[4553]: I0930 19:49:45.639653 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-89c84b54-dlsmt" podStartSLOduration=2.63962493 podStartE2EDuration="2.63962493s" podCreationTimestamp="2025-09-30 19:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:45.632331243 +0000 UTC m=+1038.831833373" watchObservedRunningTime="2025-09-30 19:49:45.63962493 +0000 UTC m=+1038.839127060" Sep 30 19:49:45 crc kubenswrapper[4553]: I0930 19:49:45.982218 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:45 crc kubenswrapper[4553]: I0930 19:49:45.982289 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:46 crc kubenswrapper[4553]: I0930 19:49:46.042646 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:46 crc kubenswrapper[4553]: I0930 19:49:46.044359 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:46 crc kubenswrapper[4553]: I0930 19:49:46.653151 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-788545c8bb-gjsrq" event={"ID":"f6470fe1-f2c0-454b-a534-b183258da4f3","Type":"ContainerStarted","Data":"f73f03b3cc48db377b74a4c0319380b2fba329c7797c741a264f0f1693622c5a"} Sep 30 19:49:46 crc kubenswrapper[4553]: I0930 19:49:46.653794 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:46 crc kubenswrapper[4553]: I0930 19:49:46.653890 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 19:49:46 crc kubenswrapper[4553]: I0930 19:49:46.653898 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 19:49:46 crc kubenswrapper[4553]: I0930 19:49:46.655096 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.658498 4553 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-578c97db4-g464k" event={"ID":"fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5","Type":"ContainerStarted","Data":"ed5b2c6468e53060fa55e2d92245bd6d78170e17d7593c29b785ef9a1d74baca"} Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.658953 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-578c97db4-g464k" Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.661461 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-788545c8bb-gjsrq" event={"ID":"f6470fe1-f2c0-454b-a534-b183258da4f3","Type":"ContainerStarted","Data":"4db9fe1cb580b76aef401c6111b9b0e7cd40466a010e247cd4038438a16e81b0"} Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.661485 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-788545c8bb-gjsrq" event={"ID":"f6470fe1-f2c0-454b-a534-b183258da4f3","Type":"ContainerStarted","Data":"47cf2bc3d8c56520af6dc56901328b152380dbba2848ee3f7b5837ec7f101f86"} Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.661496 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.663196 4553 generic.go:334] "Generic (PLEG): container finished" podID="57ddc390-73c9-44d4-941d-63f506633035" containerID="49a95dbb76b49c9171120de3ddf18a5111b09fa2e50f123a2335243c6a65d22e" exitCode=0 Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.664066 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" event={"ID":"57ddc390-73c9-44d4-941d-63f506633035","Type":"ContainerDied","Data":"49a95dbb76b49c9171120de3ddf18a5111b09fa2e50f123a2335243c6a65d22e"} Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.703398 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-578c97db4-g464k" podStartSLOduration=3.703378544 podStartE2EDuration="3.703378544s" 
podCreationTimestamp="2025-09-30 19:49:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:47.697357941 +0000 UTC m=+1040.896860071" watchObservedRunningTime="2025-09-30 19:49:47.703378544 +0000 UTC m=+1040.902880674"
Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.834872 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-69bfb64645-4wbwh"]
Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.836473 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.844603 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.845069 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.894175 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69bfb64645-4wbwh"]
Sep 30 19:49:47 crc kubenswrapper[4553]: I0930 19:49:47.911121 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-788545c8bb-gjsrq" podStartSLOduration=3.91109942 podStartE2EDuration="3.91109942s" podCreationTimestamp="2025-09-30 19:49:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:47.828902586 +0000 UTC m=+1041.028404716" watchObservedRunningTime="2025-09-30 19:49:47.91109942 +0000 UTC m=+1041.110601550"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.024658 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-config\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.024737 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-ovndb-tls-certs\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.024768 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-internal-tls-certs\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.024788 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-httpd-config\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.024804 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd7ks\" (UniqueName: \"kubernetes.io/projected/b140e797-51c4-4f37-9062-a604eef8c280-kube-api-access-bd7ks\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.024826 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-public-tls-certs\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.024850 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-combined-ca-bundle\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.127029 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-ovndb-tls-certs\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.127095 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-internal-tls-certs\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.127112 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-httpd-config\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.127129 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd7ks\" (UniqueName: \"kubernetes.io/projected/b140e797-51c4-4f37-9062-a604eef8c280-kube-api-access-bd7ks\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.127155 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-public-tls-certs\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.127178 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-combined-ca-bundle\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.127233 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-config\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.131752 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-config\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.134522 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-internal-tls-certs\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.135574 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-public-tls-certs\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.135976 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-httpd-config\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.147249 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-ovndb-tls-certs\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.147766 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b140e797-51c4-4f37-9062-a604eef8c280-combined-ca-bundle\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.157683 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd7ks\" (UniqueName: \"kubernetes.io/projected/b140e797-51c4-4f37-9062-a604eef8c280-kube-api-access-bd7ks\") pod \"neutron-69bfb64645-4wbwh\" (UID: \"b140e797-51c4-4f37-9062-a604eef8c280\") " pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.183517 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.691197 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k52t8" event={"ID":"c9958ea9-408e-4b14-8b23-dd1662654cd1","Type":"ContainerStarted","Data":"ad622983d2e249b22159fc9ce9068573954aa8aeaf612e174db930d57a095e88"}
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.710279 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.710301 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.711555 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" event={"ID":"57ddc390-73c9-44d4-941d-63f506633035","Type":"ContainerStarted","Data":"e5a1f4391ae1e636bc8cdaa2c000aaa4bdfe49fbb416df110c1e96bda396a10b"}
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.712369 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-hrvtc"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.713782 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-k52t8" podStartSLOduration=3.204703895 podStartE2EDuration="49.713766796s" podCreationTimestamp="2025-09-30 19:48:59 +0000 UTC" firstStartedPulling="2025-09-30 19:49:01.491655634 +0000 UTC m=+994.691157764" lastFinishedPulling="2025-09-30 19:49:48.000718525 +0000 UTC m=+1041.200220665" observedRunningTime="2025-09-30 19:49:48.710024046 +0000 UTC m=+1041.909526176" watchObservedRunningTime="2025-09-30 19:49:48.713766796 +0000 UTC m=+1041.913268926"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.810211 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" podStartSLOduration=4.810190865 podStartE2EDuration="4.810190865s" podCreationTimestamp="2025-09-30 19:49:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:48.744732831 +0000 UTC m=+1041.944234961" watchObservedRunningTime="2025-09-30 19:49:48.810190865 +0000 UTC m=+1042.009692995"
Sep 30 19:49:48 crc kubenswrapper[4553]: I0930 19:49:48.855664 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69bfb64645-4wbwh"]
Sep 30 19:49:49 crc kubenswrapper[4553]: I0930 19:49:49.499704 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Sep 30 19:49:49 crc kubenswrapper[4553]: I0930 19:49:49.501683 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Sep 30 19:49:49 crc kubenswrapper[4553]: I0930 19:49:49.607835 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 30 19:49:49 crc kubenswrapper[4553]: I0930 19:49:49.607938 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 19:49:49 crc kubenswrapper[4553]: I0930 19:49:49.734296 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69bfb64645-4wbwh" event={"ID":"b140e797-51c4-4f37-9062-a604eef8c280","Type":"ContainerStarted","Data":"923324643a73b7a00d31acd1a3993ac19902de1536b4403a8f07c3678f35ed5f"}
Sep 30 19:49:49 crc kubenswrapper[4553]: I0930 19:49:49.734329 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69bfb64645-4wbwh" event={"ID":"b140e797-51c4-4f37-9062-a604eef8c280","Type":"ContainerStarted","Data":"1d2cc18c05f4083378141384600adbbd9e53b7b19757a689c3e04b4d19fe5e6e"}
Sep 30 19:49:49 crc kubenswrapper[4553]: I0930 19:49:49.734355 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69bfb64645-4wbwh" event={"ID":"b140e797-51c4-4f37-9062-a604eef8c280","Type":"ContainerStarted","Data":"554dca5fdc750c04753ff6b9c5287a306d6cbcaf1f8cc106ec0552bf97d4373d"}
Sep 30 19:49:49 crc kubenswrapper[4553]: I0930 19:49:49.734847 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-69bfb64645-4wbwh"
Sep 30 19:49:49 crc kubenswrapper[4553]: I0930 19:49:49.758433 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-69bfb64645-4wbwh" podStartSLOduration=2.758411142 podStartE2EDuration="2.758411142s" podCreationTimestamp="2025-09-30 19:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:49:49.751287001 +0000 UTC m=+1042.950789151" watchObservedRunningTime="2025-09-30 19:49:49.758411142 +0000 UTC m=+1042.957913272"
Sep 30 19:49:49 crc kubenswrapper[4553]: I0930 19:49:49.832483 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 30 19:49:50 crc kubenswrapper[4553]: I0930 19:49:50.743080 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-prf67" event={"ID":"04f1abd5-5975-4038-98b3-4b6ff0e858f7","Type":"ContainerStarted","Data":"e723bb837bff29cbcd7be40ca76e68a47b2270b22b4ad9ed4b84c32865f45688"}
Sep 30 19:49:51 crc kubenswrapper[4553]: I0930 19:49:51.429369 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-84c849768b-8k9mh" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused"
Sep 30 19:49:51 crc kubenswrapper[4553]: I0930 19:49:51.555744 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-868c6b469d-rhw7t" podUID="849f4ec8-2741-4c83-82d8-135a24b43447" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused"
Sep 30 19:49:51 crc kubenswrapper[4553]: I0930 19:49:51.756582 4553 generic.go:334] "Generic (PLEG): container finished" podID="c9958ea9-408e-4b14-8b23-dd1662654cd1" containerID="ad622983d2e249b22159fc9ce9068573954aa8aeaf612e174db930d57a095e88" exitCode=0
Sep 30 19:49:51 crc kubenswrapper[4553]: I0930 19:49:51.756666 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k52t8" event={"ID":"c9958ea9-408e-4b14-8b23-dd1662654cd1","Type":"ContainerDied","Data":"ad622983d2e249b22159fc9ce9068573954aa8aeaf612e174db930d57a095e88"}
Sep 30 19:49:51 crc kubenswrapper[4553]: I0930 19:49:51.779358 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-prf67" podStartSLOduration=3.47584639 podStartE2EDuration="52.779342143s" podCreationTimestamp="2025-09-30 19:48:59 +0000 UTC" firstStartedPulling="2025-09-30 19:49:00.679376408 +0000 UTC m=+993.878878538" lastFinishedPulling="2025-09-30 19:49:49.982872161 +0000 UTC m=+1043.182374291" observedRunningTime="2025-09-30 19:49:51.774344078 +0000 UTC m=+1044.973846218" watchObservedRunningTime="2025-09-30 19:49:51.779342143 +0000 UTC m=+1044.978844273"
Sep 30 19:49:54 crc kubenswrapper[4553]: I0930 19:49:54.642223 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-hrvtc"
Sep 30 19:49:54 crc kubenswrapper[4553]: I0930 19:49:54.696837 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rcrn7"]
Sep 30 19:49:54 crc kubenswrapper[4553]: I0930 19:49:54.697106 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" podUID="8ab14a4f-024a-4e42-96a2-6ca958df01f5" containerName="dnsmasq-dns" containerID="cri-o://d8169d85f4db9e47515e590e5c07105ad5c3777fd24627357301b6121802eeb8" gracePeriod=10
Sep 30 19:49:55 crc kubenswrapper[4553]: I0930 19:49:55.790572 4553 generic.go:334] "Generic (PLEG): container finished" podID="8ab14a4f-024a-4e42-96a2-6ca958df01f5" containerID="d8169d85f4db9e47515e590e5c07105ad5c3777fd24627357301b6121802eeb8" exitCode=0
Sep 30 19:49:55 crc kubenswrapper[4553]: I0930 19:49:55.790621 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" event={"ID":"8ab14a4f-024a-4e42-96a2-6ca958df01f5","Type":"ContainerDied","Data":"d8169d85f4db9e47515e590e5c07105ad5c3777fd24627357301b6121802eeb8"}
Sep 30 19:49:56 crc kubenswrapper[4553]: I0930 19:49:56.815388 4553 generic.go:334] "Generic (PLEG): container finished" podID="04f1abd5-5975-4038-98b3-4b6ff0e858f7" containerID="e723bb837bff29cbcd7be40ca76e68a47b2270b22b4ad9ed4b84c32865f45688" exitCode=0
Sep 30 19:49:56 crc kubenswrapper[4553]: I0930 19:49:56.815483 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-prf67" event={"ID":"04f1abd5-5975-4038-98b3-4b6ff0e858f7","Type":"ContainerDied","Data":"e723bb837bff29cbcd7be40ca76e68a47b2270b22b4ad9ed4b84c32865f45688"}
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.132923 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-k52t8"
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.165632 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7"
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.166407 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-prf67"
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.220493 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-config\") pod \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.220805 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9958ea9-408e-4b14-8b23-dd1662654cd1-db-sync-config-data\") pod \"c9958ea9-408e-4b14-8b23-dd1662654cd1\" (UID: \"c9958ea9-408e-4b14-8b23-dd1662654cd1\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.220847 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04f1abd5-5975-4038-98b3-4b6ff0e858f7-etc-machine-id\") pod \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.220876 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-ovsdbserver-nb\") pod \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.220936 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-dns-svc\") pod \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.220961 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdh9d\" (UniqueName: \"kubernetes.io/projected/04f1abd5-5975-4038-98b3-4b6ff0e858f7-kube-api-access-zdh9d\") pod \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.221010 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-config-data\") pod \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.221142 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-ovsdbserver-sb\") pod \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.221158 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cplpz\" (UniqueName: \"kubernetes.io/projected/c9958ea9-408e-4b14-8b23-dd1662654cd1-kube-api-access-cplpz\") pod \"c9958ea9-408e-4b14-8b23-dd1662654cd1\" (UID: \"c9958ea9-408e-4b14-8b23-dd1662654cd1\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.221180 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-dns-swift-storage-0\") pod \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.221223 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9958ea9-408e-4b14-8b23-dd1662654cd1-combined-ca-bundle\") pod \"c9958ea9-408e-4b14-8b23-dd1662654cd1\" (UID: \"c9958ea9-408e-4b14-8b23-dd1662654cd1\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.221254 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-db-sync-config-data\") pod \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.221303 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-combined-ca-bundle\") pod \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.221329 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckbbf\" (UniqueName: \"kubernetes.io/projected/8ab14a4f-024a-4e42-96a2-6ca958df01f5-kube-api-access-ckbbf\") pod \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\" (UID: \"8ab14a4f-024a-4e42-96a2-6ca958df01f5\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.221347 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-scripts\") pod \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\" (UID: \"04f1abd5-5975-4038-98b3-4b6ff0e858f7\") "
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.221869 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04f1abd5-5975-4038-98b3-4b6ff0e858f7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "04f1abd5-5975-4038-98b3-4b6ff0e858f7" (UID: "04f1abd5-5975-4038-98b3-4b6ff0e858f7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.222476 4553 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04f1abd5-5975-4038-98b3-4b6ff0e858f7-etc-machine-id\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.253527 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "04f1abd5-5975-4038-98b3-4b6ff0e858f7" (UID: "04f1abd5-5975-4038-98b3-4b6ff0e858f7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.259133 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9958ea9-408e-4b14-8b23-dd1662654cd1-kube-api-access-cplpz" (OuterVolumeSpecName: "kube-api-access-cplpz") pod "c9958ea9-408e-4b14-8b23-dd1662654cd1" (UID: "c9958ea9-408e-4b14-8b23-dd1662654cd1"). InnerVolumeSpecName "kube-api-access-cplpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.259255 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f1abd5-5975-4038-98b3-4b6ff0e858f7-kube-api-access-zdh9d" (OuterVolumeSpecName: "kube-api-access-zdh9d") pod "04f1abd5-5975-4038-98b3-4b6ff0e858f7" (UID: "04f1abd5-5975-4038-98b3-4b6ff0e858f7"). InnerVolumeSpecName "kube-api-access-zdh9d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.259813 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab14a4f-024a-4e42-96a2-6ca958df01f5-kube-api-access-ckbbf" (OuterVolumeSpecName: "kube-api-access-ckbbf") pod "8ab14a4f-024a-4e42-96a2-6ca958df01f5" (UID: "8ab14a4f-024a-4e42-96a2-6ca958df01f5"). InnerVolumeSpecName "kube-api-access-ckbbf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.260224 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9958ea9-408e-4b14-8b23-dd1662654cd1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c9958ea9-408e-4b14-8b23-dd1662654cd1" (UID: "c9958ea9-408e-4b14-8b23-dd1662654cd1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.276782 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-scripts" (OuterVolumeSpecName: "scripts") pod "04f1abd5-5975-4038-98b3-4b6ff0e858f7" (UID: "04f1abd5-5975-4038-98b3-4b6ff0e858f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.314022 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04f1abd5-5975-4038-98b3-4b6ff0e858f7" (UID: "04f1abd5-5975-4038-98b3-4b6ff0e858f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.323673 4553 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c9958ea9-408e-4b14-8b23-dd1662654cd1-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.323704 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdh9d\" (UniqueName: \"kubernetes.io/projected/04f1abd5-5975-4038-98b3-4b6ff0e858f7-kube-api-access-zdh9d\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.323716 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cplpz\" (UniqueName: \"kubernetes.io/projected/c9958ea9-408e-4b14-8b23-dd1662654cd1-kube-api-access-cplpz\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.323724 4553 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.323733 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.323741 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckbbf\" (UniqueName: \"kubernetes.io/projected/8ab14a4f-024a-4e42-96a2-6ca958df01f5-kube-api-access-ckbbf\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.323750 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.336259 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8ab14a4f-024a-4e42-96a2-6ca958df01f5" (UID: "8ab14a4f-024a-4e42-96a2-6ca958df01f5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.353375 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9958ea9-408e-4b14-8b23-dd1662654cd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9958ea9-408e-4b14-8b23-dd1662654cd1" (UID: "c9958ea9-408e-4b14-8b23-dd1662654cd1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.424989 4553 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.425012 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9958ea9-408e-4b14-8b23-dd1662654cd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.454487 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8ab14a4f-024a-4e42-96a2-6ca958df01f5" (UID: "8ab14a4f-024a-4e42-96a2-6ca958df01f5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.454682 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-config-data" (OuterVolumeSpecName: "config-data") pod "04f1abd5-5975-4038-98b3-4b6ff0e858f7" (UID: "04f1abd5-5975-4038-98b3-4b6ff0e858f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.469483 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8ab14a4f-024a-4e42-96a2-6ca958df01f5" (UID: "8ab14a4f-024a-4e42-96a2-6ca958df01f5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.474478 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-config" (OuterVolumeSpecName: "config") pod "8ab14a4f-024a-4e42-96a2-6ca958df01f5" (UID: "8ab14a4f-024a-4e42-96a2-6ca958df01f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.494816 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8ab14a4f-024a-4e42-96a2-6ca958df01f5" (UID: "8ab14a4f-024a-4e42-96a2-6ca958df01f5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.526956 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.526992 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-config\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.527002 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.527013 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8ab14a4f-024a-4e42-96a2-6ca958df01f5-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.527022 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f1abd5-5975-4038-98b3-4b6ff0e858f7-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.843857 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k52t8" event={"ID":"c9958ea9-408e-4b14-8b23-dd1662654cd1","Type":"ContainerDied","Data":"31d5a871755fb18d5b9ea89b5e5a3bd1fc66efa43b7c0b7836db9e1f04bfcafd"}
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.844348 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31d5a871755fb18d5b9ea89b5e5a3bd1fc66efa43b7c0b7836db9e1f04bfcafd"
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.843898 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-k52t8"
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.846309 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7" event={"ID":"8ab14a4f-024a-4e42-96a2-6ca958df01f5","Type":"ContainerDied","Data":"52d6307a7def104c10fc939cfee7d8820cf9982f43599ff93d67bc821c0d0c33"}
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.846341 4553 scope.go:117] "RemoveContainer" containerID="d8169d85f4db9e47515e590e5c07105ad5c3777fd24627357301b6121802eeb8"
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.846463 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-rcrn7"
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.853387 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-prf67" event={"ID":"04f1abd5-5975-4038-98b3-4b6ff0e858f7","Type":"ContainerDied","Data":"c849e2e448fb5bf4e0f6ae6bb9d6b09353372671d50d9ec4a697e6cedacca6e9"}
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.853432 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c849e2e448fb5bf4e0f6ae6bb9d6b09353372671d50d9ec4a697e6cedacca6e9"
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.853491 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-prf67"
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.882160 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906fbd6e-e72f-428f-b182-f583c009fc93","Type":"ContainerStarted","Data":"ef79c7b49a6f73438898daee41199605311a3c77ec954697ef461a9b333d63c1"}
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.882375 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="ceilometer-central-agent" containerID="cri-o://eaa3c6c1e223c9364eb621d094f9988bd88e3ef3bc3795406d83865beacbcb14" gracePeriod=30
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.882508 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.882537 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="proxy-httpd" containerID="cri-o://ef79c7b49a6f73438898daee41199605311a3c77ec954697ef461a9b333d63c1" gracePeriod=30
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.882593 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="sg-core" containerID="cri-o://c27e637cb185e60de88caba20847f484972bb2019a3d306286cc5bef0a9e3716" gracePeriod=30
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.882636 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="ceilometer-notification-agent" containerID="cri-o://56bc6c36a9d32687a0c688eca63263beb2355f09162b188e93be26dee739122f" gracePeriod=30
Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.912302 4553 
scope.go:117] "RemoveContainer" containerID="ae2c0e9e2a23f40fe2430e153a1ed8ff54b7e8439afa5d67168fea8d1931ff83" Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.912453 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.688757043 podStartE2EDuration="59.912440771s" podCreationTimestamp="2025-09-30 19:48:59 +0000 UTC" firstStartedPulling="2025-09-30 19:49:00.85352358 +0000 UTC m=+994.053025710" lastFinishedPulling="2025-09-30 19:49:58.077207308 +0000 UTC m=+1051.276709438" observedRunningTime="2025-09-30 19:49:58.904730164 +0000 UTC m=+1052.104232314" watchObservedRunningTime="2025-09-30 19:49:58.912440771 +0000 UTC m=+1052.111942901" Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.953137 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rcrn7"] Sep 30 19:49:58 crc kubenswrapper[4553]: I0930 19:49:58.960019 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-rcrn7"] Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.143590 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:49:59 crc kubenswrapper[4553]: E0930 19:49:59.143935 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9958ea9-408e-4b14-8b23-dd1662654cd1" containerName="barbican-db-sync" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.143948 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9958ea9-408e-4b14-8b23-dd1662654cd1" containerName="barbican-db-sync" Sep 30 19:49:59 crc kubenswrapper[4553]: E0930 19:49:59.143963 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab14a4f-024a-4e42-96a2-6ca958df01f5" containerName="dnsmasq-dns" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.143969 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab14a4f-024a-4e42-96a2-6ca958df01f5" containerName="dnsmasq-dns" Sep 30 
19:49:59 crc kubenswrapper[4553]: E0930 19:49:59.143984 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab14a4f-024a-4e42-96a2-6ca958df01f5" containerName="init" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.143990 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab14a4f-024a-4e42-96a2-6ca958df01f5" containerName="init" Sep 30 19:49:59 crc kubenswrapper[4553]: E0930 19:49:59.143999 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f1abd5-5975-4038-98b3-4b6ff0e858f7" containerName="cinder-db-sync" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.144006 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f1abd5-5975-4038-98b3-4b6ff0e858f7" containerName="cinder-db-sync" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.144189 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9958ea9-408e-4b14-8b23-dd1662654cd1" containerName="barbican-db-sync" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.144207 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="04f1abd5-5975-4038-98b3-4b6ff0e858f7" containerName="cinder-db-sync" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.144215 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab14a4f-024a-4e42-96a2-6ca958df01f5" containerName="dnsmasq-dns" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.148262 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.151742 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tvsgj" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.151978 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.156573 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.158412 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.190983 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.246998 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-scripts\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.247248 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn6fc\" (UniqueName: \"kubernetes.io/projected/7e985377-9ff3-424f-9347-7841e2e60426-kube-api-access-bn6fc\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.247354 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e985377-9ff3-424f-9347-7841e2e60426-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.247446 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-config-data\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.247636 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.247724 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.262413 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b895b5785-876hk"] Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.263861 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.272890 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-876hk"] Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.351951 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-dns-svc\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.352008 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e985377-9ff3-424f-9347-7841e2e60426-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.352035 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-config-data\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.352116 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.352140 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqlgq\" (UniqueName: 
\"kubernetes.io/projected/9867b5d3-4a23-463c-bb9c-1043b9521fe6-kube-api-access-xqlgq\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.352164 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.352202 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.352232 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.352249 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-config\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.352291 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-scripts\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.352310 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.352325 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn6fc\" (UniqueName: \"kubernetes.io/projected/7e985377-9ff3-424f-9347-7841e2e60426-kube-api-access-bn6fc\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.352634 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e985377-9ff3-424f-9347-7841e2e60426-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.395241 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-scripts\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.395390 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.395634 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.395812 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-config-data\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.399495 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn6fc\" (UniqueName: \"kubernetes.io/projected/7e985377-9ff3-424f-9347-7841e2e60426-kube-api-access-bn6fc\") pod \"cinder-scheduler-0\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.457992 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.458076 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-dns-svc\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.458108 4553 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.458128 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqlgq\" (UniqueName: \"kubernetes.io/projected/9867b5d3-4a23-463c-bb9c-1043b9521fe6-kube-api-access-xqlgq\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.458152 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.458209 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-config\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.459057 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-config\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.459586 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.460079 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-dns-svc\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.460568 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.461569 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.484014 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.524215 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqlgq\" (UniqueName: \"kubernetes.io/projected/9867b5d3-4a23-463c-bb9c-1043b9521fe6-kube-api-access-xqlgq\") pod \"dnsmasq-dns-b895b5785-876hk\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.525673 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab14a4f-024a-4e42-96a2-6ca958df01f5" path="/var/lib/kubelet/pods/8ab14a4f-024a-4e42-96a2-6ca958df01f5/volumes" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.584923 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.600084 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.604901 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.629740 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.645469 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.671456 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.671505 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-config-data\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.671553 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v78c2\" (UniqueName: \"kubernetes.io/projected/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-kube-api-access-v78c2\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.671623 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-logs\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.671663 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-scripts\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.671680 4553 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-config-data-custom\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.671730 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.778353 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-logs\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.778419 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-config-data-custom\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.778436 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-scripts\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.778479 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.778515 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.778534 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-config-data\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.778586 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v78c2\" (UniqueName: \"kubernetes.io/projected/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-kube-api-access-v78c2\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.779305 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-logs\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.779840 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.805129 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-config-data-custom\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.808021 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.809696 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-scripts\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.810425 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-config-data\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.813336 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-56db9cccd9-gb669"] Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.814670 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.837735 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fb95k" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.847736 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.847911 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.863551 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5f4f58b95d-2zhxk"] Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.864866 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v78c2\" (UniqueName: \"kubernetes.io/projected/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-kube-api-access-v78c2\") pod \"cinder-api-0\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.879886 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38816918-17bb-4279-8b49-b9d696171461-config-data-custom\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.879958 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38816918-17bb-4279-8b49-b9d696171461-config-data\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:49:59 crc kubenswrapper[4553]: 
I0930 19:49:59.879989 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttj5l\" (UniqueName: \"kubernetes.io/projected/38816918-17bb-4279-8b49-b9d696171461-kube-api-access-ttj5l\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.880005 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38816918-17bb-4279-8b49-b9d696171461-logs\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.880030 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38816918-17bb-4279-8b49-b9d696171461-combined-ca-bundle\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.880165 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.908131 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.908359 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56db9cccd9-gb669"] Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.916108 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f4f58b95d-2zhxk"] Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.947670 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.959971 4553 generic.go:334] "Generic (PLEG): container finished" podID="906fbd6e-e72f-428f-b182-f583c009fc93" containerID="ef79c7b49a6f73438898daee41199605311a3c77ec954697ef461a9b333d63c1" exitCode=0 Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.960010 4553 generic.go:334] "Generic (PLEG): container finished" podID="906fbd6e-e72f-428f-b182-f583c009fc93" containerID="c27e637cb185e60de88caba20847f484972bb2019a3d306286cc5bef0a9e3716" exitCode=2 Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.960094 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906fbd6e-e72f-428f-b182-f583c009fc93","Type":"ContainerDied","Data":"ef79c7b49a6f73438898daee41199605311a3c77ec954697ef461a9b333d63c1"} Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.960124 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906fbd6e-e72f-428f-b182-f583c009fc93","Type":"ContainerDied","Data":"c27e637cb185e60de88caba20847f484972bb2019a3d306286cc5bef0a9e3716"} Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.981404 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ttj5l\" (UniqueName: \"kubernetes.io/projected/38816918-17bb-4279-8b49-b9d696171461-kube-api-access-ttj5l\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.981452 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38816918-17bb-4279-8b49-b9d696171461-logs\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.981486 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38816918-17bb-4279-8b49-b9d696171461-combined-ca-bundle\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.981543 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d39e00ce-6b28-4add-98a2-e4330753f27e-config-data\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.981571 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d39e00ce-6b28-4add-98a2-e4330753f27e-logs\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 
19:49:59.981589 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d39e00ce-6b28-4add-98a2-e4330753f27e-combined-ca-bundle\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.981623 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48cfv\" (UniqueName: \"kubernetes.io/projected/d39e00ce-6b28-4add-98a2-e4330753f27e-kube-api-access-48cfv\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.981656 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d39e00ce-6b28-4add-98a2-e4330753f27e-config-data-custom\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.981679 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38816918-17bb-4279-8b49-b9d696171461-config-data-custom\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.981724 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38816918-17bb-4279-8b49-b9d696171461-config-data\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: 
\"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.983176 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38816918-17bb-4279-8b49-b9d696171461-logs\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.995612 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38816918-17bb-4279-8b49-b9d696171461-config-data\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:49:59 crc kubenswrapper[4553]: I0930 19:49:59.996721 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38816918-17bb-4279-8b49-b9d696171461-config-data-custom\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.004092 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38816918-17bb-4279-8b49-b9d696171461-combined-ca-bundle\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.025191 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-876hk"] Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.068336 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttj5l\" (UniqueName: 
\"kubernetes.io/projected/38816918-17bb-4279-8b49-b9d696171461-kube-api-access-ttj5l\") pod \"barbican-worker-56db9cccd9-gb669\" (UID: \"38816918-17bb-4279-8b49-b9d696171461\") " pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.094168 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d39e00ce-6b28-4add-98a2-e4330753f27e-combined-ca-bundle\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.094238 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48cfv\" (UniqueName: \"kubernetes.io/projected/d39e00ce-6b28-4add-98a2-e4330753f27e-kube-api-access-48cfv\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.094276 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d39e00ce-6b28-4add-98a2-e4330753f27e-config-data-custom\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.094384 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d39e00ce-6b28-4add-98a2-e4330753f27e-config-data\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.094407 4553 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d39e00ce-6b28-4add-98a2-e4330753f27e-logs\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.094782 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d39e00ce-6b28-4add-98a2-e4330753f27e-logs\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.106604 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d39e00ce-6b28-4add-98a2-e4330753f27e-config-data\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.112658 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d39e00ce-6b28-4add-98a2-e4330753f27e-combined-ca-bundle\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.113207 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d39e00ce-6b28-4add-98a2-e4330753f27e-config-data-custom\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:50:00 crc kubenswrapper[4553]: 
I0930 19:50:00.113804 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-fk7nz"] Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.126605 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.155312 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48cfv\" (UniqueName: \"kubernetes.io/projected/d39e00ce-6b28-4add-98a2-e4330753f27e-kube-api-access-48cfv\") pod \"barbican-keystone-listener-5f4f58b95d-2zhxk\" (UID: \"d39e00ce-6b28-4add-98a2-e4330753f27e\") " pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.156721 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-fk7nz"] Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.199454 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.199509 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb84d\" (UniqueName: \"kubernetes.io/projected/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-kube-api-access-pb84d\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.199540 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-config\") pod 
\"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.199647 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.199686 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.199723 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.230545 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56db9cccd9-gb669" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.268292 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.308990 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.309057 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.309084 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.309124 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.309144 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb84d\" (UniqueName: \"kubernetes.io/projected/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-kube-api-access-pb84d\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.309184 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-config\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.309976 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-config\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.312118 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.312692 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.313306 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.313861 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.318110 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-775f8d6cf8-484p6"] Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.319692 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.322219 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.367107 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-775f8d6cf8-484p6"] Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.406922 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb84d\" (UniqueName: \"kubernetes.io/projected/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-kube-api-access-pb84d\") pod \"dnsmasq-dns-5c9776ccc5-fk7nz\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.425230 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-config-data\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.426203 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fa486e18-3471-4970-b3b3-495d02626b6d-logs\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.426291 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-config-data-custom\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.426427 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sfnl\" (UniqueName: \"kubernetes.io/projected/fa486e18-3471-4970-b3b3-495d02626b6d-kube-api-access-5sfnl\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.426637 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-combined-ca-bundle\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.513083 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.535505 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.537507 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-combined-ca-bundle\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.537639 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-config-data\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.537681 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa486e18-3471-4970-b3b3-495d02626b6d-logs\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.537716 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-config-data-custom\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.537761 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sfnl\" (UniqueName: 
\"kubernetes.io/projected/fa486e18-3471-4970-b3b3-495d02626b6d-kube-api-access-5sfnl\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.540871 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa486e18-3471-4970-b3b3-495d02626b6d-logs\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.551167 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-combined-ca-bundle\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.555516 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-config-data-custom\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.557573 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sfnl\" (UniqueName: \"kubernetes.io/projected/fa486e18-3471-4970-b3b3-495d02626b6d-kube-api-access-5sfnl\") pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.558092 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-config-data\") 
pod \"barbican-api-775f8d6cf8-484p6\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.779517 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-876hk"] Sep 30 19:50:00 crc kubenswrapper[4553]: I0930 19:50:00.811423 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:01 crc kubenswrapper[4553]: I0930 19:50:01.015872 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:50:01 crc kubenswrapper[4553]: I0930 19:50:01.094299 4553 generic.go:334] "Generic (PLEG): container finished" podID="906fbd6e-e72f-428f-b182-f583c009fc93" containerID="eaa3c6c1e223c9364eb621d094f9988bd88e3ef3bc3795406d83865beacbcb14" exitCode=0 Sep 30 19:50:01 crc kubenswrapper[4553]: I0930 19:50:01.094409 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906fbd6e-e72f-428f-b182-f583c009fc93","Type":"ContainerDied","Data":"eaa3c6c1e223c9364eb621d094f9988bd88e3ef3bc3795406d83865beacbcb14"} Sep 30 19:50:01 crc kubenswrapper[4553]: I0930 19:50:01.107078 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7e985377-9ff3-424f-9347-7841e2e60426","Type":"ContainerStarted","Data":"ee35be9dc417cf306eeb6980e6e8b3a52b8d1cbfe89afad55ef47e2560d2cb72"} Sep 30 19:50:01 crc kubenswrapper[4553]: I0930 19:50:01.108238 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-876hk" event={"ID":"9867b5d3-4a23-463c-bb9c-1043b9521fe6","Type":"ContainerStarted","Data":"939d97ed2f9f0a939fd1912fe48309834d4316d509baada46036e6c5b70a664c"} Sep 30 19:50:01 crc kubenswrapper[4553]: I0930 19:50:01.128155 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f4f58b95d-2zhxk"] Sep 30 19:50:01 crc 
kubenswrapper[4553]: I0930 19:50:01.142082 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56db9cccd9-gb669"] Sep 30 19:50:01 crc kubenswrapper[4553]: W0930 19:50:01.152735 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38816918_17bb_4279_8b49_b9d696171461.slice/crio-b83c525628460135b2ce411a2aac390433733f8f550d4eebfacd16062626e236 WatchSource:0}: Error finding container b83c525628460135b2ce411a2aac390433733f8f550d4eebfacd16062626e236: Status 404 returned error can't find the container with id b83c525628460135b2ce411a2aac390433733f8f550d4eebfacd16062626e236 Sep 30 19:50:01 crc kubenswrapper[4553]: I0930 19:50:01.308134 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-fk7nz"] Sep 30 19:50:01 crc kubenswrapper[4553]: I0930 19:50:01.429723 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-84c849768b-8k9mh" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Sep 30 19:50:01 crc kubenswrapper[4553]: I0930 19:50:01.557627 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-868c6b469d-rhw7t" podUID="849f4ec8-2741-4c83-82d8-135a24b43447" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Sep 30 19:50:01 crc kubenswrapper[4553]: I0930 19:50:01.579172 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-775f8d6cf8-484p6"] Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.129118 4553 generic.go:334] "Generic (PLEG): container finished" podID="79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" 
containerID="b25f38208409c6530e26f5ca28f0cc22d73a56943e6f4c96987700870d0de220" exitCode=0 Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.129378 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" event={"ID":"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2","Type":"ContainerDied","Data":"b25f38208409c6530e26f5ca28f0cc22d73a56943e6f4c96987700870d0de220"} Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.129401 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" event={"ID":"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2","Type":"ContainerStarted","Data":"94d848d60f1db24cc760317d8999e82eaf2a8a0f6f7adc47bf0a020216b77214"} Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.134839 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-775f8d6cf8-484p6" event={"ID":"fa486e18-3471-4970-b3b3-495d02626b6d","Type":"ContainerStarted","Data":"9f7fb7190b1cd7fd6d2551d47e1560078d150ed76ec77d76fb59d43cf8c70f2d"} Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.134871 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-775f8d6cf8-484p6" event={"ID":"fa486e18-3471-4970-b3b3-495d02626b6d","Type":"ContainerStarted","Data":"8db8dc70fa1d633022f260144132246e1b915ff79b392131017f234c050e4396"} Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.134884 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.134909 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.158761 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.159912 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707","Type":"ContainerStarted","Data":"22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7"} Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.159948 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707","Type":"ContainerStarted","Data":"03ae7e0ba9a5e80c37bdeb665bcc45d0b644b01b2cb8833359f49d4e157504cd"} Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.168600 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" event={"ID":"d39e00ce-6b28-4add-98a2-e4330753f27e","Type":"ContainerStarted","Data":"3a5e32b3111aee2422226a35cda944e55d3d51788c231495bf617aea18d61d54"} Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.170797 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56db9cccd9-gb669" event={"ID":"38816918-17bb-4279-8b49-b9d696171461","Type":"ContainerStarted","Data":"b83c525628460135b2ce411a2aac390433733f8f550d4eebfacd16062626e236"} Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.179825 4553 generic.go:334] "Generic (PLEG): container finished" podID="9867b5d3-4a23-463c-bb9c-1043b9521fe6" containerID="69d9bf864ec878ef5f20ad018952a00789f7f0ac31bc74516bc74e42caf8c1fd" exitCode=0 Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.179864 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-876hk" event={"ID":"9867b5d3-4a23-463c-bb9c-1043b9521fe6","Type":"ContainerDied","Data":"69d9bf864ec878ef5f20ad018952a00789f7f0ac31bc74516bc74e42caf8c1fd"} Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.204026 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-775f8d6cf8-484p6" podStartSLOduration=2.204005397 podStartE2EDuration="2.204005397s" podCreationTimestamp="2025-09-30 19:50:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:50:02.19448087 +0000 UTC m=+1055.393983000" watchObservedRunningTime="2025-09-30 19:50:02.204005397 +0000 UTC m=+1055.403507527" Sep 30 19:50:02 crc kubenswrapper[4553]: I0930 19:50:02.857585 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.035691 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-dns-svc\") pod \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.035983 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqlgq\" (UniqueName: \"kubernetes.io/projected/9867b5d3-4a23-463c-bb9c-1043b9521fe6-kube-api-access-xqlgq\") pod \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.036023 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-ovsdbserver-sb\") pod \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.036107 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-dns-swift-storage-0\") pod \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.036162 4553 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-ovsdbserver-nb\") pod \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.036296 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-config\") pod \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\" (UID: \"9867b5d3-4a23-463c-bb9c-1043b9521fe6\") " Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.051151 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9867b5d3-4a23-463c-bb9c-1043b9521fe6-kube-api-access-xqlgq" (OuterVolumeSpecName: "kube-api-access-xqlgq") pod "9867b5d3-4a23-463c-bb9c-1043b9521fe6" (UID: "9867b5d3-4a23-463c-bb9c-1043b9521fe6"). InnerVolumeSpecName "kube-api-access-xqlgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.088618 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9867b5d3-4a23-463c-bb9c-1043b9521fe6" (UID: "9867b5d3-4a23-463c-bb9c-1043b9521fe6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.091198 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9867b5d3-4a23-463c-bb9c-1043b9521fe6" (UID: "9867b5d3-4a23-463c-bb9c-1043b9521fe6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.104299 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9867b5d3-4a23-463c-bb9c-1043b9521fe6" (UID: "9867b5d3-4a23-463c-bb9c-1043b9521fe6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.131353 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9867b5d3-4a23-463c-bb9c-1043b9521fe6" (UID: "9867b5d3-4a23-463c-bb9c-1043b9521fe6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.137289 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.137313 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqlgq\" (UniqueName: \"kubernetes.io/projected/9867b5d3-4a23-463c-bb9c-1043b9521fe6-kube-api-access-xqlgq\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.137326 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.137334 4553 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-dns-swift-storage-0\") on node \"crc\" DevicePath 
\"\"" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.137346 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.139493 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-config" (OuterVolumeSpecName: "config") pod "9867b5d3-4a23-463c-bb9c-1043b9521fe6" (UID: "9867b5d3-4a23-463c-bb9c-1043b9521fe6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.151982 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.229148 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-775f8d6cf8-484p6" event={"ID":"fa486e18-3471-4970-b3b3-495d02626b6d","Type":"ContainerStarted","Data":"b3a516f1382bbca1abbebf5f177c7424716a45fe2ad25608e0e2bbfbe597bdae"} Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.232267 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-876hk" event={"ID":"9867b5d3-4a23-463c-bb9c-1043b9521fe6","Type":"ContainerDied","Data":"939d97ed2f9f0a939fd1912fe48309834d4316d509baada46036e6c5b70a664c"} Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.232311 4553 scope.go:117] "RemoveContainer" containerID="69d9bf864ec878ef5f20ad018952a00789f7f0ac31bc74516bc74e42caf8c1fd" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.232321 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-876hk" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.238530 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906fbd6e-e72f-428f-b182-f583c009fc93-run-httpd\") pod \"906fbd6e-e72f-428f-b182-f583c009fc93\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.238677 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-sg-core-conf-yaml\") pod \"906fbd6e-e72f-428f-b182-f583c009fc93\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.238728 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906fbd6e-e72f-428f-b182-f583c009fc93-log-httpd\") pod \"906fbd6e-e72f-428f-b182-f583c009fc93\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.238749 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-config-data\") pod \"906fbd6e-e72f-428f-b182-f583c009fc93\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.238796 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-combined-ca-bundle\") pod \"906fbd6e-e72f-428f-b182-f583c009fc93\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.238830 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7gfm\" 
(UniqueName: \"kubernetes.io/projected/906fbd6e-e72f-428f-b182-f583c009fc93-kube-api-access-s7gfm\") pod \"906fbd6e-e72f-428f-b182-f583c009fc93\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.238856 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-scripts\") pod \"906fbd6e-e72f-428f-b182-f583c009fc93\" (UID: \"906fbd6e-e72f-428f-b182-f583c009fc93\") " Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.239315 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9867b5d3-4a23-463c-bb9c-1043b9521fe6-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.241072 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906fbd6e-e72f-428f-b182-f583c009fc93-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "906fbd6e-e72f-428f-b182-f583c009fc93" (UID: "906fbd6e-e72f-428f-b182-f583c009fc93"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.241142 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906fbd6e-e72f-428f-b182-f583c009fc93-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "906fbd6e-e72f-428f-b182-f583c009fc93" (UID: "906fbd6e-e72f-428f-b182-f583c009fc93"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.244242 4553 generic.go:334] "Generic (PLEG): container finished" podID="906fbd6e-e72f-428f-b182-f583c009fc93" containerID="56bc6c36a9d32687a0c688eca63263beb2355f09162b188e93be26dee739122f" exitCode=0 Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.244329 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.244295 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906fbd6e-e72f-428f-b182-f583c009fc93","Type":"ContainerDied","Data":"56bc6c36a9d32687a0c688eca63263beb2355f09162b188e93be26dee739122f"} Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.244596 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906fbd6e-e72f-428f-b182-f583c009fc93","Type":"ContainerDied","Data":"37e912b00f4c9f782f11089f0f3fac81a5cbbb16025730f787d3ab4daa71bb8d"} Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.253954 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-scripts" (OuterVolumeSpecName: "scripts") pod "906fbd6e-e72f-428f-b182-f583c009fc93" (UID: "906fbd6e-e72f-428f-b182-f583c009fc93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.254095 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906fbd6e-e72f-428f-b182-f583c009fc93-kube-api-access-s7gfm" (OuterVolumeSpecName: "kube-api-access-s7gfm") pod "906fbd6e-e72f-428f-b182-f583c009fc93" (UID: "906fbd6e-e72f-428f-b182-f583c009fc93"). InnerVolumeSpecName "kube-api-access-s7gfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.262419 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7e985377-9ff3-424f-9347-7841e2e60426","Type":"ContainerStarted","Data":"31e6803da6f9e9471ff4c623deef160c06221f0d6fc58a0bdfda8ff830d28412"} Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.269355 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "906fbd6e-e72f-428f-b182-f583c009fc93" (UID: "906fbd6e-e72f-428f-b182-f583c009fc93"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.276250 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" event={"ID":"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2","Type":"ContainerStarted","Data":"daa54f87540d5848f02895dca26447bec227591679bc085a6a421daed3ae1f98"} Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.277063 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.299471 4553 scope.go:117] "RemoveContainer" containerID="ef79c7b49a6f73438898daee41199605311a3c77ec954697ef461a9b333d63c1" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.332118 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-876hk"] Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.372940 4553 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906fbd6e-e72f-428f-b182-f583c009fc93-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.372979 4553 reconciler_common.go:293] "Volume 
detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.372989 4553 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906fbd6e-e72f-428f-b182-f583c009fc93-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.373010 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7gfm\" (UniqueName: \"kubernetes.io/projected/906fbd6e-e72f-428f-b182-f583c009fc93-kube-api-access-s7gfm\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.373022 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.395546 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-876hk"] Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.396760 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" podStartSLOduration=3.396739543 podStartE2EDuration="3.396739543s" podCreationTimestamp="2025-09-30 19:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:50:03.348320108 +0000 UTC m=+1056.547822238" watchObservedRunningTime="2025-09-30 19:50:03.396739543 +0000 UTC m=+1056.596241673" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.444599 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "906fbd6e-e72f-428f-b182-f583c009fc93" 
(UID: "906fbd6e-e72f-428f-b182-f583c009fc93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.474756 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.521807 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-config-data" (OuterVolumeSpecName: "config-data") pod "906fbd6e-e72f-428f-b182-f583c009fc93" (UID: "906fbd6e-e72f-428f-b182-f583c009fc93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.530887 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9867b5d3-4a23-463c-bb9c-1043b9521fe6" path="/var/lib/kubelet/pods/9867b5d3-4a23-463c-bb9c-1043b9521fe6/volumes" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.573821 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.583343 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906fbd6e-e72f-428f-b182-f583c009fc93-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.588669 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.617527 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:03 crc kubenswrapper[4553]: E0930 19:50:03.617856 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="sg-core" Sep 30 
19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.617873 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="sg-core" Sep 30 19:50:03 crc kubenswrapper[4553]: E0930 19:50:03.617896 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9867b5d3-4a23-463c-bb9c-1043b9521fe6" containerName="init" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.617904 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="9867b5d3-4a23-463c-bb9c-1043b9521fe6" containerName="init" Sep 30 19:50:03 crc kubenswrapper[4553]: E0930 19:50:03.617917 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="ceilometer-notification-agent" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.617924 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="ceilometer-notification-agent" Sep 30 19:50:03 crc kubenswrapper[4553]: E0930 19:50:03.617934 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="proxy-httpd" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.617939 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="proxy-httpd" Sep 30 19:50:03 crc kubenswrapper[4553]: E0930 19:50:03.617957 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="ceilometer-central-agent" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.617963 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="ceilometer-central-agent" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.618135 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="9867b5d3-4a23-463c-bb9c-1043b9521fe6" containerName="init" Sep 30 19:50:03 crc 
kubenswrapper[4553]: I0930 19:50:03.622557 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="ceilometer-notification-agent" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.622590 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="sg-core" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.622614 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="proxy-httpd" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.622621 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" containerName="ceilometer-central-agent" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.627174 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.631811 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.632009 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.633541 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.690321 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/063b4634-4c30-4ddd-ab2e-2c238126621e-log-httpd\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.690391 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.690450 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/063b4634-4c30-4ddd-ab2e-2c238126621e-run-httpd\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.690516 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.690536 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-scripts\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.690599 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-config-data\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.690637 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h2t9\" (UniqueName: \"kubernetes.io/projected/063b4634-4c30-4ddd-ab2e-2c238126621e-kube-api-access-4h2t9\") pod \"ceilometer-0\" (UID: 
\"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.792458 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.792773 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/063b4634-4c30-4ddd-ab2e-2c238126621e-run-httpd\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.792810 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.793358 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-scripts\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.793382 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-config-data\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.793417 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h2t9\" 
(UniqueName: \"kubernetes.io/projected/063b4634-4c30-4ddd-ab2e-2c238126621e-kube-api-access-4h2t9\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.793475 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/063b4634-4c30-4ddd-ab2e-2c238126621e-log-httpd\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.793708 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/063b4634-4c30-4ddd-ab2e-2c238126621e-log-httpd\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.793408 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/063b4634-4c30-4ddd-ab2e-2c238126621e-run-httpd\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.799753 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.802334 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.803415 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-scripts\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.811730 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-config-data\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.812219 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h2t9\" (UniqueName: \"kubernetes.io/projected/063b4634-4c30-4ddd-ab2e-2c238126621e-kube-api-access-4h2t9\") pod \"ceilometer-0\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " pod="openstack/ceilometer-0" Sep 30 19:50:03 crc kubenswrapper[4553]: I0930 19:50:03.961456 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.296516 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7e985377-9ff3-424f-9347-7841e2e60426","Type":"ContainerStarted","Data":"a220980875d0864c84a4fa159c127b460515e5732ee722158313b9f3dde7d9ea"} Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.321200 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.45085568 podStartE2EDuration="5.32118147s" podCreationTimestamp="2025-09-30 19:49:59 +0000 UTC" firstStartedPulling="2025-09-30 19:50:00.617773518 +0000 UTC m=+1053.817275648" lastFinishedPulling="2025-09-30 19:50:01.488099318 +0000 UTC m=+1054.687601438" observedRunningTime="2025-09-30 19:50:04.314460609 +0000 UTC m=+1057.513962739" watchObservedRunningTime="2025-09-30 19:50:04.32118147 +0000 UTC m=+1057.520683590" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.323700 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707","Type":"ContainerStarted","Data":"2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a"} Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.323909 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" containerName="cinder-api-log" containerID="cri-o://22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7" gracePeriod=30 Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.324308 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" containerName="cinder-api" containerID="cri-o://2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a" gracePeriod=30 Sep 30 19:50:04 crc kubenswrapper[4553]: 
I0930 19:50:04.348025 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.348003362 podStartE2EDuration="5.348003362s" podCreationTimestamp="2025-09-30 19:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:50:04.342090203 +0000 UTC m=+1057.541592333" watchObservedRunningTime="2025-09-30 19:50:04.348003362 +0000 UTC m=+1057.547505492" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.486856 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 19:50:04 crc kubenswrapper[4553]: E0930 19:50:04.569917 4553 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3c5d9e6_51b6_48a5_b0a3_c9a8896c6707.slice/crio-22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3c5d9e6_51b6_48a5_b0a3_c9a8896c6707.slice/crio-conmon-22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3c5d9e6_51b6_48a5_b0a3_c9a8896c6707.slice/crio-conmon-2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a.scope\": RecentStats: unable to find data in memory cache]" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.601447 4553 scope.go:117] "RemoveContainer" containerID="c27e637cb185e60de88caba20847f484972bb2019a3d306286cc5bef0a9e3716" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.720835 4553 scope.go:117] "RemoveContainer" containerID="56bc6c36a9d32687a0c688eca63263beb2355f09162b188e93be26dee739122f" Sep 30 19:50:04 crc kubenswrapper[4553]: 
I0930 19:50:04.766259 4553 scope.go:117] "RemoveContainer" containerID="eaa3c6c1e223c9364eb621d094f9988bd88e3ef3bc3795406d83865beacbcb14" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.814420 4553 scope.go:117] "RemoveContainer" containerID="ef79c7b49a6f73438898daee41199605311a3c77ec954697ef461a9b333d63c1" Sep 30 19:50:04 crc kubenswrapper[4553]: E0930 19:50:04.840587 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef79c7b49a6f73438898daee41199605311a3c77ec954697ef461a9b333d63c1\": container with ID starting with ef79c7b49a6f73438898daee41199605311a3c77ec954697ef461a9b333d63c1 not found: ID does not exist" containerID="ef79c7b49a6f73438898daee41199605311a3c77ec954697ef461a9b333d63c1" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.840636 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef79c7b49a6f73438898daee41199605311a3c77ec954697ef461a9b333d63c1"} err="failed to get container status \"ef79c7b49a6f73438898daee41199605311a3c77ec954697ef461a9b333d63c1\": rpc error: code = NotFound desc = could not find container \"ef79c7b49a6f73438898daee41199605311a3c77ec954697ef461a9b333d63c1\": container with ID starting with ef79c7b49a6f73438898daee41199605311a3c77ec954697ef461a9b333d63c1 not found: ID does not exist" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.840662 4553 scope.go:117] "RemoveContainer" containerID="c27e637cb185e60de88caba20847f484972bb2019a3d306286cc5bef0a9e3716" Sep 30 19:50:04 crc kubenswrapper[4553]: E0930 19:50:04.847638 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c27e637cb185e60de88caba20847f484972bb2019a3d306286cc5bef0a9e3716\": container with ID starting with c27e637cb185e60de88caba20847f484972bb2019a3d306286cc5bef0a9e3716 not found: ID does not exist" 
containerID="c27e637cb185e60de88caba20847f484972bb2019a3d306286cc5bef0a9e3716" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.847697 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c27e637cb185e60de88caba20847f484972bb2019a3d306286cc5bef0a9e3716"} err="failed to get container status \"c27e637cb185e60de88caba20847f484972bb2019a3d306286cc5bef0a9e3716\": rpc error: code = NotFound desc = could not find container \"c27e637cb185e60de88caba20847f484972bb2019a3d306286cc5bef0a9e3716\": container with ID starting with c27e637cb185e60de88caba20847f484972bb2019a3d306286cc5bef0a9e3716 not found: ID does not exist" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.847732 4553 scope.go:117] "RemoveContainer" containerID="56bc6c36a9d32687a0c688eca63263beb2355f09162b188e93be26dee739122f" Sep 30 19:50:04 crc kubenswrapper[4553]: E0930 19:50:04.848110 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56bc6c36a9d32687a0c688eca63263beb2355f09162b188e93be26dee739122f\": container with ID starting with 56bc6c36a9d32687a0c688eca63263beb2355f09162b188e93be26dee739122f not found: ID does not exist" containerID="56bc6c36a9d32687a0c688eca63263beb2355f09162b188e93be26dee739122f" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.848148 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56bc6c36a9d32687a0c688eca63263beb2355f09162b188e93be26dee739122f"} err="failed to get container status \"56bc6c36a9d32687a0c688eca63263beb2355f09162b188e93be26dee739122f\": rpc error: code = NotFound desc = could not find container \"56bc6c36a9d32687a0c688eca63263beb2355f09162b188e93be26dee739122f\": container with ID starting with 56bc6c36a9d32687a0c688eca63263beb2355f09162b188e93be26dee739122f not found: ID does not exist" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.848166 4553 scope.go:117] 
"RemoveContainer" containerID="eaa3c6c1e223c9364eb621d094f9988bd88e3ef3bc3795406d83865beacbcb14" Sep 30 19:50:04 crc kubenswrapper[4553]: E0930 19:50:04.848405 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa3c6c1e223c9364eb621d094f9988bd88e3ef3bc3795406d83865beacbcb14\": container with ID starting with eaa3c6c1e223c9364eb621d094f9988bd88e3ef3bc3795406d83865beacbcb14 not found: ID does not exist" containerID="eaa3c6c1e223c9364eb621d094f9988bd88e3ef3bc3795406d83865beacbcb14" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.848428 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa3c6c1e223c9364eb621d094f9988bd88e3ef3bc3795406d83865beacbcb14"} err="failed to get container status \"eaa3c6c1e223c9364eb621d094f9988bd88e3ef3bc3795406d83865beacbcb14\": rpc error: code = NotFound desc = could not find container \"eaa3c6c1e223c9364eb621d094f9988bd88e3ef3bc3795406d83865beacbcb14\": container with ID starting with eaa3c6c1e223c9364eb621d094f9988bd88e3ef3bc3795406d83865beacbcb14 not found: ID does not exist" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.948882 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 19:50:04 crc kubenswrapper[4553]: I0930 19:50:04.974725 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.032977 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-logs\") pod \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.033021 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-scripts\") pod \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.033105 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-etc-machine-id\") pod \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.033218 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v78c2\" (UniqueName: \"kubernetes.io/projected/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-kube-api-access-v78c2\") pod \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.033283 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-combined-ca-bundle\") pod \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.033334 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-config-data\") pod \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.033369 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-config-data-custom\") pod \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\" (UID: \"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707\") " Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.034210 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" (UID: "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.034920 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-logs" (OuterVolumeSpecName: "logs") pod "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" (UID: "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.052060 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-scripts" (OuterVolumeSpecName: "scripts") pod "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" (UID: "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.052605 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-kube-api-access-v78c2" (OuterVolumeSpecName: "kube-api-access-v78c2") pod "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" (UID: "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707"). InnerVolumeSpecName "kube-api-access-v78c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.055653 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" (UID: "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.082216 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" (UID: "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.100207 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-config-data" (OuterVolumeSpecName: "config-data") pod "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" (UID: "e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.137084 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v78c2\" (UniqueName: \"kubernetes.io/projected/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-kube-api-access-v78c2\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.137113 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.137123 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.137132 4553 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.137141 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.137150 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.137157 4553 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.312304 4553 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.344097 4553 generic.go:334] "Generic (PLEG): container finished" podID="e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" containerID="2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a" exitCode=0 Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.344422 4553 generic.go:334] "Generic (PLEG): container finished" podID="e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" containerID="22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7" exitCode=143 Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.344604 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707","Type":"ContainerDied","Data":"2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a"} Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.344715 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707","Type":"ContainerDied","Data":"22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7"} Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.345440 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707","Type":"ContainerDied","Data":"03ae7e0ba9a5e80c37bdeb665bcc45d0b644b01b2cb8833359f49d4e157504cd"} Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.345551 4553 scope.go:117] "RemoveContainer" containerID="2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.345729 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.353935 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" event={"ID":"d39e00ce-6b28-4add-98a2-e4330753f27e","Type":"ContainerStarted","Data":"ff2785a4b385ccba379b56bac33b06c5b1e886f2fb91d8d8adf53c8563dff1be"} Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.354129 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" event={"ID":"d39e00ce-6b28-4add-98a2-e4330753f27e","Type":"ContainerStarted","Data":"e65aae4d3f9b49cc288cbaeee3c7af945f7f7d8399dabe905788fe590ab8f0bf"} Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.366162 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56db9cccd9-gb669" event={"ID":"38816918-17bb-4279-8b49-b9d696171461","Type":"ContainerStarted","Data":"f4a4e00e4db6042dc39757643622128a314e7db5b70af48056c5ca269998a207"} Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.366356 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56db9cccd9-gb669" event={"ID":"38816918-17bb-4279-8b49-b9d696171461","Type":"ContainerStarted","Data":"d5c3cd681bbcdba0336139619b235764e78d29b65e3306a07f4058116de3a773"} Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.389033 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5f4f58b95d-2zhxk" podStartSLOduration=2.802496759 podStartE2EDuration="6.389012191s" podCreationTimestamp="2025-09-30 19:49:59 +0000 UTC" firstStartedPulling="2025-09-30 19:50:01.143206325 +0000 UTC m=+1054.342708455" lastFinishedPulling="2025-09-30 19:50:04.729721757 +0000 UTC m=+1057.929223887" observedRunningTime="2025-09-30 19:50:05.375572788 +0000 UTC m=+1058.575074918" watchObservedRunningTime="2025-09-30 19:50:05.389012191 +0000 UTC m=+1058.588514321" Sep 30 
19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.408807 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-56db9cccd9-gb669" podStartSLOduration=2.839924048 podStartE2EDuration="6.408790074s" podCreationTimestamp="2025-09-30 19:49:59 +0000 UTC" firstStartedPulling="2025-09-30 19:50:01.16081844 +0000 UTC m=+1054.360320570" lastFinishedPulling="2025-09-30 19:50:04.729684466 +0000 UTC m=+1057.929186596" observedRunningTime="2025-09-30 19:50:05.401818166 +0000 UTC m=+1058.601320296" watchObservedRunningTime="2025-09-30 19:50:05.408790074 +0000 UTC m=+1058.608292194" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.419650 4553 scope.go:117] "RemoveContainer" containerID="22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.449470 4553 scope.go:117] "RemoveContainer" containerID="2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a" Sep 30 19:50:05 crc kubenswrapper[4553]: E0930 19:50:05.449838 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a\": container with ID starting with 2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a not found: ID does not exist" containerID="2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.449902 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a"} err="failed to get container status \"2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a\": rpc error: code = NotFound desc = could not find container \"2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a\": container with ID starting with 
2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a not found: ID does not exist" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.449933 4553 scope.go:117] "RemoveContainer" containerID="22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7" Sep 30 19:50:05 crc kubenswrapper[4553]: E0930 19:50:05.450270 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7\": container with ID starting with 22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7 not found: ID does not exist" containerID="22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.450304 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7"} err="failed to get container status \"22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7\": rpc error: code = NotFound desc = could not find container \"22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7\": container with ID starting with 22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7 not found: ID does not exist" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.450328 4553 scope.go:117] "RemoveContainer" containerID="2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.450589 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a"} err="failed to get container status \"2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a\": rpc error: code = NotFound desc = could not find container \"2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a\": container with ID 
starting with 2874fbef2c102fbe44e136036e3b08c7a325dbb7485df88013c08ff455c7564a not found: ID does not exist" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.450606 4553 scope.go:117] "RemoveContainer" containerID="22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.450818 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7"} err="failed to get container status \"22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7\": rpc error: code = NotFound desc = could not find container \"22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7\": container with ID starting with 22879795d9ab4dcfc8083c610a66784367e606b479e9e4ab257dfb8e3db345a7 not found: ID does not exist" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.472009 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.494411 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.504935 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:50:05 crc kubenswrapper[4553]: E0930 19:50:05.505382 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" containerName="cinder-api" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.505400 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" containerName="cinder-api" Sep 30 19:50:05 crc kubenswrapper[4553]: E0930 19:50:05.505431 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" containerName="cinder-api-log" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.505437 4553 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" containerName="cinder-api-log" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.505625 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" containerName="cinder-api" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.505650 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" containerName="cinder-api-log" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.507542 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.516510 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.516736 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.516896 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.521468 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="906fbd6e-e72f-428f-b182-f583c009fc93" path="/var/lib/kubelet/pods/906fbd6e-e72f-428f-b182-f583c009fc93/volumes" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.526756 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707" path="/var/lib/kubelet/pods/e3c5d9e6-51b6-48a5-b0a3-c9a8896c6707/volumes" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.532653 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.544349 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-config-data\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.544409 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e2eb0f0-7643-448e-a97e-bd6551fe128e-logs\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.544431 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-scripts\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.544451 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.544472 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e2eb0f0-7643-448e-a97e-bd6551fe128e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.544509 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.544558 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxz4n\" (UniqueName: \"kubernetes.io/projected/4e2eb0f0-7643-448e-a97e-bd6551fe128e-kube-api-access-rxz4n\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.544730 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-config-data-custom\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.544858 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.646403 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.646472 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-config-data\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.646506 
4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e2eb0f0-7643-448e-a97e-bd6551fe128e-logs\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.646533 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-scripts\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.646560 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.646587 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e2eb0f0-7643-448e-a97e-bd6551fe128e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.646614 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.646667 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxz4n\" (UniqueName: \"kubernetes.io/projected/4e2eb0f0-7643-448e-a97e-bd6551fe128e-kube-api-access-rxz4n\") pod \"cinder-api-0\" (UID: 
\"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.646714 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-config-data-custom\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.648832 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e2eb0f0-7643-448e-a97e-bd6551fe128e-logs\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.648987 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e2eb0f0-7643-448e-a97e-bd6551fe128e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.652642 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.653333 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-config-data\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.653900 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.654543 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-config-data-custom\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.655487 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.656238 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e2eb0f0-7643-448e-a97e-bd6551fe128e-scripts\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.669669 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxz4n\" (UniqueName: \"kubernetes.io/projected/4e2eb0f0-7643-448e-a97e-bd6551fe128e-kube-api-access-rxz4n\") pod \"cinder-api-0\" (UID: \"4e2eb0f0-7643-448e-a97e-bd6551fe128e\") " pod="openstack/cinder-api-0" Sep 30 19:50:05 crc kubenswrapper[4553]: I0930 19:50:05.847930 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.300705 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5c89dc44dd-6ghsx"] Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.302426 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.304904 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.305022 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.315859 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c89dc44dd-6ghsx"] Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.375005 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.377686 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"063b4634-4c30-4ddd-ab2e-2c238126621e","Type":"ContainerStarted","Data":"73346d129b1264e1112e76b2003efdd193384e9cb883ebe76badf1d31387c3a6"} Sep 30 19:50:06 crc kubenswrapper[4553]: W0930 19:50:06.394414 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e2eb0f0_7643_448e_a97e_bd6551fe128e.slice/crio-c515833c34d1021d190d600df37cb9d8841b1219a927fd1dee292b29f86a9122 WatchSource:0}: Error finding container c515833c34d1021d190d600df37cb9d8841b1219a927fd1dee292b29f86a9122: Status 404 returned error can't find the container with id c515833c34d1021d190d600df37cb9d8841b1219a927fd1dee292b29f86a9122 Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.467174 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-config-data-custom\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.467291 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-combined-ca-bundle\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.467342 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-public-tls-certs\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.467366 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/964c839f-1077-4e46-bbb9-f25807d73ed7-logs\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.467462 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-internal-tls-certs\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 
19:50:06.467578 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-config-data\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.467634 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbq76\" (UniqueName: \"kubernetes.io/projected/964c839f-1077-4e46-bbb9-f25807d73ed7-kube-api-access-zbq76\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.569334 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-combined-ca-bundle\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.569650 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-public-tls-certs\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.569754 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/964c839f-1077-4e46-bbb9-f25807d73ed7-logs\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.569871 
4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-internal-tls-certs\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.570059 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-config-data\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.570149 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbq76\" (UniqueName: \"kubernetes.io/projected/964c839f-1077-4e46-bbb9-f25807d73ed7-kube-api-access-zbq76\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.570456 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-config-data-custom\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.571633 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/964c839f-1077-4e46-bbb9-f25807d73ed7-logs\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.577751 4553 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-public-tls-certs\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.588379 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-internal-tls-certs\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.599229 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbq76\" (UniqueName: \"kubernetes.io/projected/964c839f-1077-4e46-bbb9-f25807d73ed7-kube-api-access-zbq76\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.603592 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-config-data\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.603624 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-config-data-custom\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.608726 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/964c839f-1077-4e46-bbb9-f25807d73ed7-combined-ca-bundle\") pod \"barbican-api-5c89dc44dd-6ghsx\" (UID: \"964c839f-1077-4e46-bbb9-f25807d73ed7\") " pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:06 crc kubenswrapper[4553]: I0930 19:50:06.626369 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:07 crc kubenswrapper[4553]: I0930 19:50:07.328147 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c89dc44dd-6ghsx"] Sep 30 19:50:07 crc kubenswrapper[4553]: I0930 19:50:07.423186 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"063b4634-4c30-4ddd-ab2e-2c238126621e","Type":"ContainerStarted","Data":"b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b"} Sep 30 19:50:07 crc kubenswrapper[4553]: I0930 19:50:07.423679 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"063b4634-4c30-4ddd-ab2e-2c238126621e","Type":"ContainerStarted","Data":"1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511"} Sep 30 19:50:07 crc kubenswrapper[4553]: I0930 19:50:07.450250 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c89dc44dd-6ghsx" event={"ID":"964c839f-1077-4e46-bbb9-f25807d73ed7","Type":"ContainerStarted","Data":"1a57e3aec61c9af76c916ea2acfb493594a60be83b39596123b6903f56ed97bb"} Sep 30 19:50:07 crc kubenswrapper[4553]: I0930 19:50:07.466352 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4e2eb0f0-7643-448e-a97e-bd6551fe128e","Type":"ContainerStarted","Data":"761a2f3d113914108ae44f5a317d6831f4569e743d20d71c2518cc2b05893591"} Sep 30 19:50:07 crc kubenswrapper[4553]: I0930 19:50:07.466400 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"4e2eb0f0-7643-448e-a97e-bd6551fe128e","Type":"ContainerStarted","Data":"c515833c34d1021d190d600df37cb9d8841b1219a927fd1dee292b29f86a9122"} Sep 30 19:50:08 crc kubenswrapper[4553]: I0930 19:50:08.479032 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"063b4634-4c30-4ddd-ab2e-2c238126621e","Type":"ContainerStarted","Data":"075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e"} Sep 30 19:50:08 crc kubenswrapper[4553]: I0930 19:50:08.480792 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c89dc44dd-6ghsx" event={"ID":"964c839f-1077-4e46-bbb9-f25807d73ed7","Type":"ContainerStarted","Data":"b236ca953306a5268fe180e22735623d773796e44739a8a279455de598cbf9ba"} Sep 30 19:50:08 crc kubenswrapper[4553]: I0930 19:50:08.480837 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c89dc44dd-6ghsx" event={"ID":"964c839f-1077-4e46-bbb9-f25807d73ed7","Type":"ContainerStarted","Data":"69c9b684ffaeb7d688060f92ef52dc9bba085d6af86d8e8fb06ad4469bfbe7cc"} Sep 30 19:50:08 crc kubenswrapper[4553]: I0930 19:50:08.480945 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:08 crc kubenswrapper[4553]: I0930 19:50:08.480962 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:08 crc kubenswrapper[4553]: I0930 19:50:08.483317 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4e2eb0f0-7643-448e-a97e-bd6551fe128e","Type":"ContainerStarted","Data":"34791672fffdc22625e3bd7f6740cae205e80d672dcc26415c81d995302ab19c"} Sep 30 19:50:08 crc kubenswrapper[4553]: I0930 19:50:08.483464 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 19:50:08 crc kubenswrapper[4553]: I0930 19:50:08.500979 4553 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5c89dc44dd-6ghsx" podStartSLOduration=2.500960486 podStartE2EDuration="2.500960486s" podCreationTimestamp="2025-09-30 19:50:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:50:08.49774328 +0000 UTC m=+1061.697245400" watchObservedRunningTime="2025-09-30 19:50:08.500960486 +0000 UTC m=+1061.700462616" Sep 30 19:50:08 crc kubenswrapper[4553]: I0930 19:50:08.548874 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.548856747 podStartE2EDuration="3.548856747s" podCreationTimestamp="2025-09-30 19:50:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:50:08.546332029 +0000 UTC m=+1061.745834169" watchObservedRunningTime="2025-09-30 19:50:08.548856747 +0000 UTC m=+1061.748358877" Sep 30 19:50:09 crc kubenswrapper[4553]: I0930 19:50:09.747471 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 19:50:09 crc kubenswrapper[4553]: I0930 19:50:09.836911 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:50:10 crc kubenswrapper[4553]: I0930 19:50:10.508817 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"063b4634-4c30-4ddd-ab2e-2c238126621e","Type":"ContainerStarted","Data":"9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2"} Sep 30 19:50:10 crc kubenswrapper[4553]: I0930 19:50:10.508910 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7e985377-9ff3-424f-9347-7841e2e60426" containerName="cinder-scheduler" 
containerID="cri-o://31e6803da6f9e9471ff4c623deef160c06221f0d6fc58a0bdfda8ff830d28412" gracePeriod=30 Sep 30 19:50:10 crc kubenswrapper[4553]: I0930 19:50:10.509294 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7e985377-9ff3-424f-9347-7841e2e60426" containerName="probe" containerID="cri-o://a220980875d0864c84a4fa159c127b460515e5732ee722158313b9f3dde7d9ea" gracePeriod=30 Sep 30 19:50:10 crc kubenswrapper[4553]: I0930 19:50:10.520460 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:50:10 crc kubenswrapper[4553]: I0930 19:50:10.533517 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.050056142 podStartE2EDuration="7.53349933s" podCreationTimestamp="2025-09-30 19:50:03 +0000 UTC" firstStartedPulling="2025-09-30 19:50:05.334519823 +0000 UTC m=+1058.534021953" lastFinishedPulling="2025-09-30 19:50:09.817962971 +0000 UTC m=+1063.017465141" observedRunningTime="2025-09-30 19:50:10.529295276 +0000 UTC m=+1063.728797406" watchObservedRunningTime="2025-09-30 19:50:10.53349933 +0000 UTC m=+1063.733001460" Sep 30 19:50:10 crc kubenswrapper[4553]: I0930 19:50:10.607813 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hrvtc"] Sep 30 19:50:10 crc kubenswrapper[4553]: I0930 19:50:10.608329 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" podUID="57ddc390-73c9-44d4-941d-63f506633035" containerName="dnsmasq-dns" containerID="cri-o://e5a1f4391ae1e636bc8cdaa2c000aaa4bdfe49fbb416df110c1e96bda396a10b" gracePeriod=10 Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.158325 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.268818 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-dns-svc\") pod \"57ddc390-73c9-44d4-941d-63f506633035\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.268884 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-ovsdbserver-sb\") pod \"57ddc390-73c9-44d4-941d-63f506633035\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.269004 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fht2z\" (UniqueName: \"kubernetes.io/projected/57ddc390-73c9-44d4-941d-63f506633035-kube-api-access-fht2z\") pod \"57ddc390-73c9-44d4-941d-63f506633035\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.269028 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-dns-swift-storage-0\") pod \"57ddc390-73c9-44d4-941d-63f506633035\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.269067 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-ovsdbserver-nb\") pod \"57ddc390-73c9-44d4-941d-63f506633035\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.269094 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-config\") pod \"57ddc390-73c9-44d4-941d-63f506633035\" (UID: \"57ddc390-73c9-44d4-941d-63f506633035\") " Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.295261 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ddc390-73c9-44d4-941d-63f506633035-kube-api-access-fht2z" (OuterVolumeSpecName: "kube-api-access-fht2z") pod "57ddc390-73c9-44d4-941d-63f506633035" (UID: "57ddc390-73c9-44d4-941d-63f506633035"). InnerVolumeSpecName "kube-api-access-fht2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.368482 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-config" (OuterVolumeSpecName: "config") pod "57ddc390-73c9-44d4-941d-63f506633035" (UID: "57ddc390-73c9-44d4-941d-63f506633035"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.371515 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fht2z\" (UniqueName: \"kubernetes.io/projected/57ddc390-73c9-44d4-941d-63f506633035-kube-api-access-fht2z\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.371542 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.378627 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "57ddc390-73c9-44d4-941d-63f506633035" (UID: "57ddc390-73c9-44d4-941d-63f506633035"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.394262 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "57ddc390-73c9-44d4-941d-63f506633035" (UID: "57ddc390-73c9-44d4-941d-63f506633035"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.409067 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "57ddc390-73c9-44d4-941d-63f506633035" (UID: "57ddc390-73c9-44d4-941d-63f506633035"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.473625 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.473959 4553 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.473977 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.493621 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "57ddc390-73c9-44d4-941d-63f506633035" (UID: "57ddc390-73c9-44d4-941d-63f506633035"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.560141 4553 generic.go:334] "Generic (PLEG): container finished" podID="57ddc390-73c9-44d4-941d-63f506633035" containerID="e5a1f4391ae1e636bc8cdaa2c000aaa4bdfe49fbb416df110c1e96bda396a10b" exitCode=0 Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.560317 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.560402 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" event={"ID":"57ddc390-73c9-44d4-941d-63f506633035","Type":"ContainerDied","Data":"e5a1f4391ae1e636bc8cdaa2c000aaa4bdfe49fbb416df110c1e96bda396a10b"} Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.560446 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hrvtc" event={"ID":"57ddc390-73c9-44d4-941d-63f506633035","Type":"ContainerDied","Data":"7def9f9829a84ad83184aff086ed82a1a1573db1a2eb8b45fce5f9c0bfb7b529"} Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.560982 4553 scope.go:117] "RemoveContainer" containerID="e5a1f4391ae1e636bc8cdaa2c000aaa4bdfe49fbb416df110c1e96bda396a10b" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.561158 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.592466 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57ddc390-73c9-44d4-941d-63f506633035-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 
19:50:11.612962 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hrvtc"] Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.624209 4553 scope.go:117] "RemoveContainer" containerID="49a95dbb76b49c9171120de3ddf18a5111b09fa2e50f123a2335243c6a65d22e" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.625744 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hrvtc"] Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.659441 4553 scope.go:117] "RemoveContainer" containerID="e5a1f4391ae1e636bc8cdaa2c000aaa4bdfe49fbb416df110c1e96bda396a10b" Sep 30 19:50:11 crc kubenswrapper[4553]: E0930 19:50:11.664453 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a1f4391ae1e636bc8cdaa2c000aaa4bdfe49fbb416df110c1e96bda396a10b\": container with ID starting with e5a1f4391ae1e636bc8cdaa2c000aaa4bdfe49fbb416df110c1e96bda396a10b not found: ID does not exist" containerID="e5a1f4391ae1e636bc8cdaa2c000aaa4bdfe49fbb416df110c1e96bda396a10b" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.664489 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a1f4391ae1e636bc8cdaa2c000aaa4bdfe49fbb416df110c1e96bda396a10b"} err="failed to get container status \"e5a1f4391ae1e636bc8cdaa2c000aaa4bdfe49fbb416df110c1e96bda396a10b\": rpc error: code = NotFound desc = could not find container \"e5a1f4391ae1e636bc8cdaa2c000aaa4bdfe49fbb416df110c1e96bda396a10b\": container with ID starting with e5a1f4391ae1e636bc8cdaa2c000aaa4bdfe49fbb416df110c1e96bda396a10b not found: ID does not exist" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.664511 4553 scope.go:117] "RemoveContainer" containerID="49a95dbb76b49c9171120de3ddf18a5111b09fa2e50f123a2335243c6a65d22e" Sep 30 19:50:11 crc kubenswrapper[4553]: E0930 19:50:11.664917 4553 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"49a95dbb76b49c9171120de3ddf18a5111b09fa2e50f123a2335243c6a65d22e\": container with ID starting with 49a95dbb76b49c9171120de3ddf18a5111b09fa2e50f123a2335243c6a65d22e not found: ID does not exist" containerID="49a95dbb76b49c9171120de3ddf18a5111b09fa2e50f123a2335243c6a65d22e" Sep 30 19:50:11 crc kubenswrapper[4553]: I0930 19:50:11.664949 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a95dbb76b49c9171120de3ddf18a5111b09fa2e50f123a2335243c6a65d22e"} err="failed to get container status \"49a95dbb76b49c9171120de3ddf18a5111b09fa2e50f123a2335243c6a65d22e\": rpc error: code = NotFound desc = could not find container \"49a95dbb76b49c9171120de3ddf18a5111b09fa2e50f123a2335243c6a65d22e\": container with ID starting with 49a95dbb76b49c9171120de3ddf18a5111b09fa2e50f123a2335243c6a65d22e not found: ID does not exist" Sep 30 19:50:12 crc kubenswrapper[4553]: I0930 19:50:12.557475 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:12 crc kubenswrapper[4553]: I0930 19:50:12.565353 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:12 crc kubenswrapper[4553]: I0930 19:50:12.569835 4553 generic.go:334] "Generic (PLEG): container finished" podID="7e985377-9ff3-424f-9347-7841e2e60426" containerID="a220980875d0864c84a4fa159c127b460515e5732ee722158313b9f3dde7d9ea" exitCode=0 Sep 30 19:50:12 crc kubenswrapper[4553]: I0930 19:50:12.569893 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7e985377-9ff3-424f-9347-7841e2e60426","Type":"ContainerDied","Data":"a220980875d0864c84a4fa159c127b460515e5732ee722158313b9f3dde7d9ea"} Sep 30 19:50:13 crc kubenswrapper[4553]: I0930 19:50:13.522220 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="57ddc390-73c9-44d4-941d-63f506633035" path="/var/lib/kubelet/pods/57ddc390-73c9-44d4-941d-63f506633035/volumes" Sep 30 19:50:14 crc kubenswrapper[4553]: I0930 19:50:14.870905 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:50:14 crc kubenswrapper[4553]: I0930 19:50:14.925503 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:50:14 crc kubenswrapper[4553]: I0930 19:50:14.982889 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.563164 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.596860 4553 generic.go:334] "Generic (PLEG): container finished" podID="7e985377-9ff3-424f-9347-7841e2e60426" containerID="31e6803da6f9e9471ff4c623deef160c06221f0d6fc58a0bdfda8ff830d28412" exitCode=0 Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.596910 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7e985377-9ff3-424f-9347-7841e2e60426","Type":"ContainerDied","Data":"31e6803da6f9e9471ff4c623deef160c06221f0d6fc58a0bdfda8ff830d28412"} Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.596935 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7e985377-9ff3-424f-9347-7841e2e60426","Type":"ContainerDied","Data":"ee35be9dc417cf306eeb6980e6e8b3a52b8d1cbfe89afad55ef47e2560d2cb72"} Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.596952 4553 scope.go:117] "RemoveContainer" containerID="a220980875d0864c84a4fa159c127b460515e5732ee722158313b9f3dde7d9ea" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.597155 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.628512 4553 scope.go:117] "RemoveContainer" containerID="31e6803da6f9e9471ff4c623deef160c06221f0d6fc58a0bdfda8ff830d28412" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.680246 4553 scope.go:117] "RemoveContainer" containerID="a220980875d0864c84a4fa159c127b460515e5732ee722158313b9f3dde7d9ea" Sep 30 19:50:15 crc kubenswrapper[4553]: E0930 19:50:15.684511 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a220980875d0864c84a4fa159c127b460515e5732ee722158313b9f3dde7d9ea\": container with ID starting with a220980875d0864c84a4fa159c127b460515e5732ee722158313b9f3dde7d9ea not found: ID does not exist" containerID="a220980875d0864c84a4fa159c127b460515e5732ee722158313b9f3dde7d9ea" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.684555 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a220980875d0864c84a4fa159c127b460515e5732ee722158313b9f3dde7d9ea"} err="failed to get container status \"a220980875d0864c84a4fa159c127b460515e5732ee722158313b9f3dde7d9ea\": rpc error: code = NotFound desc = could not find container \"a220980875d0864c84a4fa159c127b460515e5732ee722158313b9f3dde7d9ea\": container with ID starting with a220980875d0864c84a4fa159c127b460515e5732ee722158313b9f3dde7d9ea not found: ID does not exist" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.684584 4553 scope.go:117] "RemoveContainer" containerID="31e6803da6f9e9471ff4c623deef160c06221f0d6fc58a0bdfda8ff830d28412" Sep 30 19:50:15 crc kubenswrapper[4553]: E0930 19:50:15.685163 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31e6803da6f9e9471ff4c623deef160c06221f0d6fc58a0bdfda8ff830d28412\": container with ID starting with 
31e6803da6f9e9471ff4c623deef160c06221f0d6fc58a0bdfda8ff830d28412 not found: ID does not exist" containerID="31e6803da6f9e9471ff4c623deef160c06221f0d6fc58a0bdfda8ff830d28412" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.685221 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31e6803da6f9e9471ff4c623deef160c06221f0d6fc58a0bdfda8ff830d28412"} err="failed to get container status \"31e6803da6f9e9471ff4c623deef160c06221f0d6fc58a0bdfda8ff830d28412\": rpc error: code = NotFound desc = could not find container \"31e6803da6f9e9471ff4c623deef160c06221f0d6fc58a0bdfda8ff830d28412\": container with ID starting with 31e6803da6f9e9471ff4c623deef160c06221f0d6fc58a0bdfda8ff830d28412 not found: ID does not exist" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.692965 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-combined-ca-bundle\") pod \"7e985377-9ff3-424f-9347-7841e2e60426\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.693017 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn6fc\" (UniqueName: \"kubernetes.io/projected/7e985377-9ff3-424f-9347-7841e2e60426-kube-api-access-bn6fc\") pod \"7e985377-9ff3-424f-9347-7841e2e60426\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.693083 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-config-data\") pod \"7e985377-9ff3-424f-9347-7841e2e60426\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.693160 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-config-data-custom\") pod \"7e985377-9ff3-424f-9347-7841e2e60426\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.693224 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-scripts\") pod \"7e985377-9ff3-424f-9347-7841e2e60426\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.693338 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e985377-9ff3-424f-9347-7841e2e60426-etc-machine-id\") pod \"7e985377-9ff3-424f-9347-7841e2e60426\" (UID: \"7e985377-9ff3-424f-9347-7841e2e60426\") " Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.693708 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e985377-9ff3-424f-9347-7841e2e60426-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7e985377-9ff3-424f-9347-7841e2e60426" (UID: "7e985377-9ff3-424f-9347-7841e2e60426"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.710281 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7e985377-9ff3-424f-9347-7841e2e60426" (UID: "7e985377-9ff3-424f-9347-7841e2e60426"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.721782 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e985377-9ff3-424f-9347-7841e2e60426-kube-api-access-bn6fc" (OuterVolumeSpecName: "kube-api-access-bn6fc") pod "7e985377-9ff3-424f-9347-7841e2e60426" (UID: "7e985377-9ff3-424f-9347-7841e2e60426"). InnerVolumeSpecName "kube-api-access-bn6fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.722361 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-scripts" (OuterVolumeSpecName: "scripts") pod "7e985377-9ff3-424f-9347-7841e2e60426" (UID: "7e985377-9ff3-424f-9347-7841e2e60426"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.790171 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e985377-9ff3-424f-9347-7841e2e60426" (UID: "7e985377-9ff3-424f-9347-7841e2e60426"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.797061 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.797091 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn6fc\" (UniqueName: \"kubernetes.io/projected/7e985377-9ff3-424f-9347-7841e2e60426-kube-api-access-bn6fc\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.797102 4553 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.797111 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.797119 4553 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e985377-9ff3-424f-9347-7841e2e60426-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.852524 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-config-data" (OuterVolumeSpecName: "config-data") pod "7e985377-9ff3-424f-9347-7841e2e60426" (UID: "7e985377-9ff3-424f-9347-7841e2e60426"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.899321 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e985377-9ff3-424f-9347-7841e2e60426-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.932690 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.941586 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.960415 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:50:15 crc kubenswrapper[4553]: E0930 19:50:15.961123 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e985377-9ff3-424f-9347-7841e2e60426" containerName="cinder-scheduler" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.961238 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e985377-9ff3-424f-9347-7841e2e60426" containerName="cinder-scheduler" Sep 30 19:50:15 crc kubenswrapper[4553]: E0930 19:50:15.961331 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ddc390-73c9-44d4-941d-63f506633035" containerName="init" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.961405 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ddc390-73c9-44d4-941d-63f506633035" containerName="init" Sep 30 19:50:15 crc kubenswrapper[4553]: E0930 19:50:15.961487 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ddc390-73c9-44d4-941d-63f506633035" containerName="dnsmasq-dns" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.961563 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ddc390-73c9-44d4-941d-63f506633035" containerName="dnsmasq-dns" Sep 30 19:50:15 crc kubenswrapper[4553]: E0930 19:50:15.961651 
4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e985377-9ff3-424f-9347-7841e2e60426" containerName="probe" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.961720 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e985377-9ff3-424f-9347-7841e2e60426" containerName="probe" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.962310 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e985377-9ff3-424f-9347-7841e2e60426" containerName="probe" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.962435 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ddc390-73c9-44d4-941d-63f506633035" containerName="dnsmasq-dns" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.962522 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e985377-9ff3-424f-9347-7841e2e60426" containerName="cinder-scheduler" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.963901 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 19:50:15 crc kubenswrapper[4553]: I0930 19:50:15.993023 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.004499 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.107270 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnskm\" (UniqueName: \"kubernetes.io/projected/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-kube-api-access-dnskm\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.107344 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.107386 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-config-data\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.107415 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-scripts\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.107433 4553 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.107463 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.208999 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.209210 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnskm\" (UniqueName: \"kubernetes.io/projected/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-kube-api-access-dnskm\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.209236 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.209566 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.209694 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-config-data\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.209767 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-scripts\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.209831 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.212532 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.213860 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-scripts\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " 
pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.216572 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-config-data\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.217588 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.251878 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnskm\" (UniqueName: \"kubernetes.io/projected/6bc11f5f-706a-4984-ac99-d28ee3c2f8b5-kube-api-access-dnskm\") pod \"cinder-scheduler-0\" (UID: \"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5\") " pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.315464 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.858777 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-89c84b54-dlsmt" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.864369 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-89c84b54-dlsmt" Sep 30 19:50:16 crc kubenswrapper[4553]: I0930 19:50:16.875330 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 19:50:16 crc kubenswrapper[4553]: W0930 19:50:16.880891 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bc11f5f_706a_4984_ac99_d28ee3c2f8b5.slice/crio-d4ff227e956c6b28ec600fe8d5a797638abf05d0f9be6d1c107c93b639731f6e WatchSource:0}: Error finding container d4ff227e956c6b28ec600fe8d5a797638abf05d0f9be6d1c107c93b639731f6e: Status 404 returned error can't find the container with id d4ff227e956c6b28ec600fe8d5a797638abf05d0f9be6d1c107c93b639731f6e Sep 30 19:50:17 crc kubenswrapper[4553]: I0930 19:50:17.518067 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e985377-9ff3-424f-9347-7841e2e60426" path="/var/lib/kubelet/pods/7e985377-9ff3-424f-9347-7841e2e60426/volumes" Sep 30 19:50:17 crc kubenswrapper[4553]: I0930 19:50:17.652156 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5","Type":"ContainerStarted","Data":"d4ff227e956c6b28ec600fe8d5a797638abf05d0f9be6d1c107c93b639731f6e"} Sep 30 19:50:18 crc kubenswrapper[4553]: I0930 19:50:18.220346 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-69bfb64645-4wbwh" Sep 30 19:50:18 crc kubenswrapper[4553]: I0930 19:50:18.289649 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-788545c8bb-gjsrq"] Sep 30 19:50:18 crc kubenswrapper[4553]: I0930 19:50:18.290159 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-788545c8bb-gjsrq" podUID="f6470fe1-f2c0-454b-a534-b183258da4f3" containerName="neutron-api" containerID="cri-o://47cf2bc3d8c56520af6dc56901328b152380dbba2848ee3f7b5837ec7f101f86" gracePeriod=30 Sep 30 19:50:18 crc kubenswrapper[4553]: I0930 19:50:18.290810 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-788545c8bb-gjsrq" podUID="f6470fe1-f2c0-454b-a534-b183258da4f3" containerName="neutron-httpd" containerID="cri-o://4db9fe1cb580b76aef401c6111b9b0e7cd40466a010e247cd4038438a16e81b0" gracePeriod=30 Sep 30 19:50:18 crc kubenswrapper[4553]: I0930 19:50:18.696867 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5","Type":"ContainerStarted","Data":"7c2cf6c000823633b9b525f29eed3429a78a7bebd0c8aab5b29fac3c723cd373"} Sep 30 19:50:19 crc kubenswrapper[4553]: I0930 19:50:19.621501 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-578c97db4-g464k" Sep 30 19:50:19 crc kubenswrapper[4553]: I0930 19:50:19.710737 4553 generic.go:334] "Generic (PLEG): container finished" podID="f6470fe1-f2c0-454b-a534-b183258da4f3" containerID="4db9fe1cb580b76aef401c6111b9b0e7cd40466a010e247cd4038438a16e81b0" exitCode=0 Sep 30 19:50:19 crc kubenswrapper[4553]: I0930 19:50:19.710820 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-788545c8bb-gjsrq" event={"ID":"f6470fe1-f2c0-454b-a534-b183258da4f3","Type":"ContainerDied","Data":"4db9fe1cb580b76aef401c6111b9b0e7cd40466a010e247cd4038438a16e81b0"} Sep 30 19:50:19 crc kubenswrapper[4553]: I0930 19:50:19.715707 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"6bc11f5f-706a-4984-ac99-d28ee3c2f8b5","Type":"ContainerStarted","Data":"158fbd1e892b994bb41540f4b9353e6e1ae5057fe0285f575d990935bad6df1c"} Sep 30 19:50:19 crc kubenswrapper[4553]: I0930 19:50:19.744182 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.744165625 podStartE2EDuration="4.744165625s" podCreationTimestamp="2025-09-30 19:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:50:19.739232961 +0000 UTC m=+1072.938735091" watchObservedRunningTime="2025-09-30 19:50:19.744165625 +0000 UTC m=+1072.943667755" Sep 30 19:50:19 crc kubenswrapper[4553]: I0930 19:50:19.868291 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="4e2eb0f0-7643-448e-a97e-bd6551fe128e" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.167:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 19:50:19 crc kubenswrapper[4553]: I0930 19:50:19.878227 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-868c6b469d-rhw7t" podUID="849f4ec8-2741-4c83-82d8-135a24b43447" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 19:50:19 crc kubenswrapper[4553]: I0930 19:50:19.943628 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:50:20 crc kubenswrapper[4553]: I0930 19:50:20.475082 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 19:50:20 crc kubenswrapper[4553]: I0930 19:50:20.759029 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:21 crc kubenswrapper[4553]: I0930 19:50:21.122607 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c89dc44dd-6ghsx" Sep 30 19:50:21 crc kubenswrapper[4553]: I0930 19:50:21.210919 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-775f8d6cf8-484p6"] Sep 30 19:50:21 crc kubenswrapper[4553]: I0930 19:50:21.211160 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-775f8d6cf8-484p6" podUID="fa486e18-3471-4970-b3b3-495d02626b6d" containerName="barbican-api-log" containerID="cri-o://9f7fb7190b1cd7fd6d2551d47e1560078d150ed76ec77d76fb59d43cf8c70f2d" gracePeriod=30 Sep 30 19:50:21 crc kubenswrapper[4553]: I0930 19:50:21.211580 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-775f8d6cf8-484p6" podUID="fa486e18-3471-4970-b3b3-495d02626b6d" containerName="barbican-api" containerID="cri-o://b3a516f1382bbca1abbebf5f177c7424716a45fe2ad25608e0e2bbfbe597bdae" gracePeriod=30 Sep 30 19:50:21 crc kubenswrapper[4553]: I0930 19:50:21.316137 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 19:50:21 crc kubenswrapper[4553]: I0930 19:50:21.731755 4553 generic.go:334] "Generic (PLEG): container finished" podID="fa486e18-3471-4970-b3b3-495d02626b6d" containerID="9f7fb7190b1cd7fd6d2551d47e1560078d150ed76ec77d76fb59d43cf8c70f2d" exitCode=143 Sep 30 19:50:21 crc kubenswrapper[4553]: I0930 19:50:21.732666 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-775f8d6cf8-484p6" event={"ID":"fa486e18-3471-4970-b3b3-495d02626b6d","Type":"ContainerDied","Data":"9f7fb7190b1cd7fd6d2551d47e1560078d150ed76ec77d76fb59d43cf8c70f2d"} Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.487793 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.573398 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-ovndb-tls-certs\") pod \"f6470fe1-f2c0-454b-a534-b183258da4f3\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.573511 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-config\") pod \"f6470fe1-f2c0-454b-a534-b183258da4f3\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.573583 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-combined-ca-bundle\") pod \"f6470fe1-f2c0-454b-a534-b183258da4f3\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.573635 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-httpd-config\") pod \"f6470fe1-f2c0-454b-a534-b183258da4f3\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.573707 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7sdp\" (UniqueName: \"kubernetes.io/projected/f6470fe1-f2c0-454b-a534-b183258da4f3-kube-api-access-l7sdp\") pod \"f6470fe1-f2c0-454b-a534-b183258da4f3\" (UID: \"f6470fe1-f2c0-454b-a534-b183258da4f3\") " Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.583219 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f6470fe1-f2c0-454b-a534-b183258da4f3" (UID: "f6470fe1-f2c0-454b-a534-b183258da4f3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.590210 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6470fe1-f2c0-454b-a534-b183258da4f3-kube-api-access-l7sdp" (OuterVolumeSpecName: "kube-api-access-l7sdp") pod "f6470fe1-f2c0-454b-a534-b183258da4f3" (UID: "f6470fe1-f2c0-454b-a534-b183258da4f3"). InnerVolumeSpecName "kube-api-access-l7sdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.677100 4553 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.677137 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7sdp\" (UniqueName: \"kubernetes.io/projected/f6470fe1-f2c0-454b-a534-b183258da4f3-kube-api-access-l7sdp\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.689180 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-config" (OuterVolumeSpecName: "config") pod "f6470fe1-f2c0-454b-a534-b183258da4f3" (UID: "f6470fe1-f2c0-454b-a534-b183258da4f3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.691149 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6470fe1-f2c0-454b-a534-b183258da4f3" (UID: "f6470fe1-f2c0-454b-a534-b183258da4f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.709178 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f6470fe1-f2c0-454b-a534-b183258da4f3" (UID: "f6470fe1-f2c0-454b-a534-b183258da4f3"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.759974 4553 generic.go:334] "Generic (PLEG): container finished" podID="f6470fe1-f2c0-454b-a534-b183258da4f3" containerID="47cf2bc3d8c56520af6dc56901328b152380dbba2848ee3f7b5837ec7f101f86" exitCode=0 Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.760026 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-788545c8bb-gjsrq" event={"ID":"f6470fe1-f2c0-454b-a534-b183258da4f3","Type":"ContainerDied","Data":"47cf2bc3d8c56520af6dc56901328b152380dbba2848ee3f7b5837ec7f101f86"} Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.760073 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-788545c8bb-gjsrq" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.760088 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-788545c8bb-gjsrq" event={"ID":"f6470fe1-f2c0-454b-a534-b183258da4f3","Type":"ContainerDied","Data":"f73f03b3cc48db377b74a4c0319380b2fba329c7797c741a264f0f1693622c5a"} Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.760110 4553 scope.go:117] "RemoveContainer" containerID="4db9fe1cb580b76aef401c6111b9b0e7cd40466a010e247cd4038438a16e81b0" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.778392 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.778419 4553 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.778429 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6470fe1-f2c0-454b-a534-b183258da4f3-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.812291 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-788545c8bb-gjsrq"] Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.819850 4553 scope.go:117] "RemoveContainer" containerID="47cf2bc3d8c56520af6dc56901328b152380dbba2848ee3f7b5837ec7f101f86" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.824815 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-788545c8bb-gjsrq"] Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.855274 4553 scope.go:117] "RemoveContainer" 
containerID="4db9fe1cb580b76aef401c6111b9b0e7cd40466a010e247cd4038438a16e81b0" Sep 30 19:50:22 crc kubenswrapper[4553]: E0930 19:50:22.855652 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db9fe1cb580b76aef401c6111b9b0e7cd40466a010e247cd4038438a16e81b0\": container with ID starting with 4db9fe1cb580b76aef401c6111b9b0e7cd40466a010e247cd4038438a16e81b0 not found: ID does not exist" containerID="4db9fe1cb580b76aef401c6111b9b0e7cd40466a010e247cd4038438a16e81b0" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.855685 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db9fe1cb580b76aef401c6111b9b0e7cd40466a010e247cd4038438a16e81b0"} err="failed to get container status \"4db9fe1cb580b76aef401c6111b9b0e7cd40466a010e247cd4038438a16e81b0\": rpc error: code = NotFound desc = could not find container \"4db9fe1cb580b76aef401c6111b9b0e7cd40466a010e247cd4038438a16e81b0\": container with ID starting with 4db9fe1cb580b76aef401c6111b9b0e7cd40466a010e247cd4038438a16e81b0 not found: ID does not exist" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.855704 4553 scope.go:117] "RemoveContainer" containerID="47cf2bc3d8c56520af6dc56901328b152380dbba2848ee3f7b5837ec7f101f86" Sep 30 19:50:22 crc kubenswrapper[4553]: E0930 19:50:22.856019 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47cf2bc3d8c56520af6dc56901328b152380dbba2848ee3f7b5837ec7f101f86\": container with ID starting with 47cf2bc3d8c56520af6dc56901328b152380dbba2848ee3f7b5837ec7f101f86 not found: ID does not exist" containerID="47cf2bc3d8c56520af6dc56901328b152380dbba2848ee3f7b5837ec7f101f86" Sep 30 19:50:22 crc kubenswrapper[4553]: I0930 19:50:22.856169 4553 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"47cf2bc3d8c56520af6dc56901328b152380dbba2848ee3f7b5837ec7f101f86"} err="failed to get container status \"47cf2bc3d8c56520af6dc56901328b152380dbba2848ee3f7b5837ec7f101f86\": rpc error: code = NotFound desc = could not find container \"47cf2bc3d8c56520af6dc56901328b152380dbba2848ee3f7b5837ec7f101f86\": container with ID starting with 47cf2bc3d8c56520af6dc56901328b152380dbba2848ee3f7b5837ec7f101f86 not found: ID does not exist" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.516119 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6470fe1-f2c0-454b-a534-b183258da4f3" path="/var/lib/kubelet/pods/f6470fe1-f2c0-454b-a534-b183258da4f3/volumes" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.574335 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 19:50:23 crc kubenswrapper[4553]: E0930 19:50:23.574797 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6470fe1-f2c0-454b-a534-b183258da4f3" containerName="neutron-api" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.574822 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6470fe1-f2c0-454b-a534-b183258da4f3" containerName="neutron-api" Sep 30 19:50:23 crc kubenswrapper[4553]: E0930 19:50:23.574846 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6470fe1-f2c0-454b-a534-b183258da4f3" containerName="neutron-httpd" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.574856 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6470fe1-f2c0-454b-a534-b183258da4f3" containerName="neutron-httpd" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.575177 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6470fe1-f2c0-454b-a534-b183258da4f3" containerName="neutron-api" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.575211 4553 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f6470fe1-f2c0-454b-a534-b183258da4f3" containerName="neutron-httpd" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.575963 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.577568 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.577963 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.581784 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-lznw4" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.594782 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.693200 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdtcl\" (UniqueName: \"kubernetes.io/projected/b81cd96a-9a9f-4334-8512-34fb38da918f-kube-api-access-cdtcl\") pod \"openstackclient\" (UID: \"b81cd96a-9a9f-4334-8512-34fb38da918f\") " pod="openstack/openstackclient" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.693263 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b81cd96a-9a9f-4334-8512-34fb38da918f-openstack-config-secret\") pod \"openstackclient\" (UID: \"b81cd96a-9a9f-4334-8512-34fb38da918f\") " pod="openstack/openstackclient" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.693321 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b81cd96a-9a9f-4334-8512-34fb38da918f-openstack-config\") pod 
\"openstackclient\" (UID: \"b81cd96a-9a9f-4334-8512-34fb38da918f\") " pod="openstack/openstackclient" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.693549 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81cd96a-9a9f-4334-8512-34fb38da918f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b81cd96a-9a9f-4334-8512-34fb38da918f\") " pod="openstack/openstackclient" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.795491 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b81cd96a-9a9f-4334-8512-34fb38da918f-openstack-config-secret\") pod \"openstackclient\" (UID: \"b81cd96a-9a9f-4334-8512-34fb38da918f\") " pod="openstack/openstackclient" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.795560 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b81cd96a-9a9f-4334-8512-34fb38da918f-openstack-config\") pod \"openstackclient\" (UID: \"b81cd96a-9a9f-4334-8512-34fb38da918f\") " pod="openstack/openstackclient" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.795630 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81cd96a-9a9f-4334-8512-34fb38da918f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b81cd96a-9a9f-4334-8512-34fb38da918f\") " pod="openstack/openstackclient" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.795713 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdtcl\" (UniqueName: \"kubernetes.io/projected/b81cd96a-9a9f-4334-8512-34fb38da918f-kube-api-access-cdtcl\") pod \"openstackclient\" (UID: \"b81cd96a-9a9f-4334-8512-34fb38da918f\") " pod="openstack/openstackclient" Sep 30 19:50:23 crc 
kubenswrapper[4553]: I0930 19:50:23.796578 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b81cd96a-9a9f-4334-8512-34fb38da918f-openstack-config\") pod \"openstackclient\" (UID: \"b81cd96a-9a9f-4334-8512-34fb38da918f\") " pod="openstack/openstackclient" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.807824 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b81cd96a-9a9f-4334-8512-34fb38da918f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b81cd96a-9a9f-4334-8512-34fb38da918f\") " pod="openstack/openstackclient" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.830954 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b81cd96a-9a9f-4334-8512-34fb38da918f-openstack-config-secret\") pod \"openstackclient\" (UID: \"b81cd96a-9a9f-4334-8512-34fb38da918f\") " pod="openstack/openstackclient" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.840615 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdtcl\" (UniqueName: \"kubernetes.io/projected/b81cd96a-9a9f-4334-8512-34fb38da918f-kube-api-access-cdtcl\") pod \"openstackclient\" (UID: \"b81cd96a-9a9f-4334-8512-34fb38da918f\") " pod="openstack/openstackclient" Sep 30 19:50:23 crc kubenswrapper[4553]: I0930 19:50:23.890154 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 19:50:24 crc kubenswrapper[4553]: I0930 19:50:24.523587 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 19:50:24 crc kubenswrapper[4553]: I0930 19:50:24.581614 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-868c6b469d-rhw7t" Sep 30 19:50:24 crc kubenswrapper[4553]: I0930 19:50:24.657189 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84c849768b-8k9mh"] Sep 30 19:50:24 crc kubenswrapper[4553]: I0930 19:50:24.659656 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84c849768b-8k9mh" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon-log" containerID="cri-o://433775455daced9402500b2f928308e29c64c51fa046fc1f0a6989a136987f2d" gracePeriod=30 Sep 30 19:50:24 crc kubenswrapper[4553]: I0930 19:50:24.660102 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84c849768b-8k9mh" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon" containerID="cri-o://865daf527791fb42a7e38b3ccc019bcf19e002bf322605476e21aceb0aab4be7" gracePeriod=30 Sep 30 19:50:24 crc kubenswrapper[4553]: I0930 19:50:24.682764 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/horizon-84c849768b-8k9mh" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Sep 30 19:50:24 crc kubenswrapper[4553]: I0930 19:50:24.683070 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84c849768b-8k9mh" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Sep 30 19:50:24 crc kubenswrapper[4553]: I0930 19:50:24.776541 4553 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b81cd96a-9a9f-4334-8512-34fb38da918f","Type":"ContainerStarted","Data":"2c22385977b8346bfaebd8e963025498913748e0cfbb43ff4cf9f10678766ed0"} Sep 30 19:50:24 crc kubenswrapper[4553]: I0930 19:50:24.778561 4553 generic.go:334] "Generic (PLEG): container finished" podID="fa486e18-3471-4970-b3b3-495d02626b6d" containerID="b3a516f1382bbca1abbebf5f177c7424716a45fe2ad25608e0e2bbfbe597bdae" exitCode=0 Sep 30 19:50:24 crc kubenswrapper[4553]: I0930 19:50:24.778605 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-775f8d6cf8-484p6" event={"ID":"fa486e18-3471-4970-b3b3-495d02626b6d","Type":"ContainerDied","Data":"b3a516f1382bbca1abbebf5f177c7424716a45fe2ad25608e0e2bbfbe597bdae"} Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.346804 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.458420 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-config-data\") pod \"fa486e18-3471-4970-b3b3-495d02626b6d\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.458583 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sfnl\" (UniqueName: \"kubernetes.io/projected/fa486e18-3471-4970-b3b3-495d02626b6d-kube-api-access-5sfnl\") pod \"fa486e18-3471-4970-b3b3-495d02626b6d\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.458671 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa486e18-3471-4970-b3b3-495d02626b6d-logs\") pod \"fa486e18-3471-4970-b3b3-495d02626b6d\" (UID: 
\"fa486e18-3471-4970-b3b3-495d02626b6d\") " Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.458726 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-combined-ca-bundle\") pod \"fa486e18-3471-4970-b3b3-495d02626b6d\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.458825 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-config-data-custom\") pod \"fa486e18-3471-4970-b3b3-495d02626b6d\" (UID: \"fa486e18-3471-4970-b3b3-495d02626b6d\") " Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.459520 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa486e18-3471-4970-b3b3-495d02626b6d-logs" (OuterVolumeSpecName: "logs") pod "fa486e18-3471-4970-b3b3-495d02626b6d" (UID: "fa486e18-3471-4970-b3b3-495d02626b6d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.465357 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fa486e18-3471-4970-b3b3-495d02626b6d" (UID: "fa486e18-3471-4970-b3b3-495d02626b6d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.465400 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa486e18-3471-4970-b3b3-495d02626b6d-kube-api-access-5sfnl" (OuterVolumeSpecName: "kube-api-access-5sfnl") pod "fa486e18-3471-4970-b3b3-495d02626b6d" (UID: "fa486e18-3471-4970-b3b3-495d02626b6d"). 
InnerVolumeSpecName "kube-api-access-5sfnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.491762 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa486e18-3471-4970-b3b3-495d02626b6d" (UID: "fa486e18-3471-4970-b3b3-495d02626b6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.529061 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-config-data" (OuterVolumeSpecName: "config-data") pod "fa486e18-3471-4970-b3b3-495d02626b6d" (UID: "fa486e18-3471-4970-b3b3-495d02626b6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.560729 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sfnl\" (UniqueName: \"kubernetes.io/projected/fa486e18-3471-4970-b3b3-495d02626b6d-kube-api-access-5sfnl\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.560761 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa486e18-3471-4970-b3b3-495d02626b6d-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.560771 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.560781 4553 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.560792 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa486e18-3471-4970-b3b3-495d02626b6d-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.790780 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-775f8d6cf8-484p6" event={"ID":"fa486e18-3471-4970-b3b3-495d02626b6d","Type":"ContainerDied","Data":"8db8dc70fa1d633022f260144132246e1b915ff79b392131017f234c050e4396"} Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.790838 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-775f8d6cf8-484p6" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.790871 4553 scope.go:117] "RemoveContainer" containerID="b3a516f1382bbca1abbebf5f177c7424716a45fe2ad25608e0e2bbfbe597bdae" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.813663 4553 scope.go:117] "RemoveContainer" containerID="9f7fb7190b1cd7fd6d2551d47e1560078d150ed76ec77d76fb59d43cf8c70f2d" Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.828753 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-775f8d6cf8-484p6"] Sep 30 19:50:25 crc kubenswrapper[4553]: I0930 19:50:25.835793 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-775f8d6cf8-484p6"] Sep 30 19:50:26 crc kubenswrapper[4553]: I0930 19:50:26.557930 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 19:50:27 crc kubenswrapper[4553]: I0930 19:50:27.517155 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa486e18-3471-4970-b3b3-495d02626b6d" path="/var/lib/kubelet/pods/fa486e18-3471-4970-b3b3-495d02626b6d/volumes" Sep 30 
19:50:27 crc kubenswrapper[4553]: I0930 19:50:27.654568 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:27 crc kubenswrapper[4553]: I0930 19:50:27.654820 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="ceilometer-central-agent" containerID="cri-o://1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511" gracePeriod=30 Sep 30 19:50:27 crc kubenswrapper[4553]: I0930 19:50:27.654918 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="ceilometer-notification-agent" containerID="cri-o://b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b" gracePeriod=30 Sep 30 19:50:27 crc kubenswrapper[4553]: I0930 19:50:27.654898 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="sg-core" containerID="cri-o://075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e" gracePeriod=30 Sep 30 19:50:27 crc kubenswrapper[4553]: I0930 19:50:27.655057 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="proxy-httpd" containerID="cri-o://9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2" gracePeriod=30 Sep 30 19:50:27 crc kubenswrapper[4553]: I0930 19:50:27.672693 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.166:3000/\": EOF" Sep 30 19:50:27 crc kubenswrapper[4553]: I0930 19:50:27.837875 4553 generic.go:334] "Generic (PLEG): container finished" podID="063b4634-4c30-4ddd-ab2e-2c238126621e" 
containerID="075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e" exitCode=2 Sep 30 19:50:27 crc kubenswrapper[4553]: I0930 19:50:27.837939 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"063b4634-4c30-4ddd-ab2e-2c238126621e","Type":"ContainerDied","Data":"075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e"} Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.616215 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-74bb65c547-lfcd8"] Sep 30 19:50:28 crc kubenswrapper[4553]: E0930 19:50:28.616757 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa486e18-3471-4970-b3b3-495d02626b6d" containerName="barbican-api" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.616770 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa486e18-3471-4970-b3b3-495d02626b6d" containerName="barbican-api" Sep 30 19:50:28 crc kubenswrapper[4553]: E0930 19:50:28.616784 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa486e18-3471-4970-b3b3-495d02626b6d" containerName="barbican-api-log" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.616790 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa486e18-3471-4970-b3b3-495d02626b6d" containerName="barbican-api-log" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.616969 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa486e18-3471-4970-b3b3-495d02626b6d" containerName="barbican-api-log" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.616981 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa486e18-3471-4970-b3b3-495d02626b6d" containerName="barbican-api" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.618135 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.623363 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.625742 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.625883 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.655846 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-74bb65c547-lfcd8"] Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.661964 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.717222 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-combined-ca-bundle\") pod \"063b4634-4c30-4ddd-ab2e-2c238126621e\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.717362 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h2t9\" (UniqueName: \"kubernetes.io/projected/063b4634-4c30-4ddd-ab2e-2c238126621e-kube-api-access-4h2t9\") pod \"063b4634-4c30-4ddd-ab2e-2c238126621e\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.717429 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-config-data\") pod \"063b4634-4c30-4ddd-ab2e-2c238126621e\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " Sep 30 19:50:28 
crc kubenswrapper[4553]: I0930 19:50:28.717457 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/063b4634-4c30-4ddd-ab2e-2c238126621e-log-httpd\") pod \"063b4634-4c30-4ddd-ab2e-2c238126621e\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.717524 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/063b4634-4c30-4ddd-ab2e-2c238126621e-run-httpd\") pod \"063b4634-4c30-4ddd-ab2e-2c238126621e\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.717539 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-scripts\") pod \"063b4634-4c30-4ddd-ab2e-2c238126621e\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.717595 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-sg-core-conf-yaml\") pod \"063b4634-4c30-4ddd-ab2e-2c238126621e\" (UID: \"063b4634-4c30-4ddd-ab2e-2c238126621e\") " Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.717818 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8354628-b925-4289-9c68-7038dd4b2064-internal-tls-certs\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.717871 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d8354628-b925-4289-9c68-7038dd4b2064-log-httpd\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.717917 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8354628-b925-4289-9c68-7038dd4b2064-etc-swift\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.717960 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8354628-b925-4289-9c68-7038dd4b2064-config-data\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.717978 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsrdh\" (UniqueName: \"kubernetes.io/projected/d8354628-b925-4289-9c68-7038dd4b2064-kube-api-access-xsrdh\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.718015 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8354628-b925-4289-9c68-7038dd4b2064-combined-ca-bundle\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.718067 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8354628-b925-4289-9c68-7038dd4b2064-public-tls-certs\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.718089 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8354628-b925-4289-9c68-7038dd4b2064-run-httpd\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.718569 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/063b4634-4c30-4ddd-ab2e-2c238126621e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "063b4634-4c30-4ddd-ab2e-2c238126621e" (UID: "063b4634-4c30-4ddd-ab2e-2c238126621e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.718589 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/063b4634-4c30-4ddd-ab2e-2c238126621e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "063b4634-4c30-4ddd-ab2e-2c238126621e" (UID: "063b4634-4c30-4ddd-ab2e-2c238126621e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.731270 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-scripts" (OuterVolumeSpecName: "scripts") pod "063b4634-4c30-4ddd-ab2e-2c238126621e" (UID: "063b4634-4c30-4ddd-ab2e-2c238126621e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.732142 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/063b4634-4c30-4ddd-ab2e-2c238126621e-kube-api-access-4h2t9" (OuterVolumeSpecName: "kube-api-access-4h2t9") pod "063b4634-4c30-4ddd-ab2e-2c238126621e" (UID: "063b4634-4c30-4ddd-ab2e-2c238126621e"). InnerVolumeSpecName "kube-api-access-4h2t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.758695 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "063b4634-4c30-4ddd-ab2e-2c238126621e" (UID: "063b4634-4c30-4ddd-ab2e-2c238126621e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.823021 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8354628-b925-4289-9c68-7038dd4b2064-config-data\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.823095 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsrdh\" (UniqueName: \"kubernetes.io/projected/d8354628-b925-4289-9c68-7038dd4b2064-kube-api-access-xsrdh\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.823130 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d8354628-b925-4289-9c68-7038dd4b2064-combined-ca-bundle\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.823166 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8354628-b925-4289-9c68-7038dd4b2064-public-tls-certs\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.823185 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8354628-b925-4289-9c68-7038dd4b2064-run-httpd\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.823213 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8354628-b925-4289-9c68-7038dd4b2064-internal-tls-certs\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.823247 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8354628-b925-4289-9c68-7038dd4b2064-log-httpd\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.823290 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/d8354628-b925-4289-9c68-7038dd4b2064-etc-swift\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.823339 4553 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/063b4634-4c30-4ddd-ab2e-2c238126621e-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.823350 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.823359 4553 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.823369 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h2t9\" (UniqueName: \"kubernetes.io/projected/063b4634-4c30-4ddd-ab2e-2c238126621e-kube-api-access-4h2t9\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.823377 4553 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/063b4634-4c30-4ddd-ab2e-2c238126621e-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.828552 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8354628-b925-4289-9c68-7038dd4b2064-public-tls-certs\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.828972 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8354628-b925-4289-9c68-7038dd4b2064-run-httpd\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.829350 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8354628-b925-4289-9c68-7038dd4b2064-log-httpd\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.832881 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8354628-b925-4289-9c68-7038dd4b2064-config-data\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.834568 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "063b4634-4c30-4ddd-ab2e-2c238126621e" (UID: "063b4634-4c30-4ddd-ab2e-2c238126621e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.836747 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8354628-b925-4289-9c68-7038dd4b2064-combined-ca-bundle\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.837309 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8354628-b925-4289-9c68-7038dd4b2064-internal-tls-certs\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.841951 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d8354628-b925-4289-9c68-7038dd4b2064-etc-swift\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.842369 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsrdh\" (UniqueName: \"kubernetes.io/projected/d8354628-b925-4289-9c68-7038dd4b2064-kube-api-access-xsrdh\") pod \"swift-proxy-74bb65c547-lfcd8\" (UID: \"d8354628-b925-4289-9c68-7038dd4b2064\") " pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.866143 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-config-data" (OuterVolumeSpecName: "config-data") pod "063b4634-4c30-4ddd-ab2e-2c238126621e" (UID: "063b4634-4c30-4ddd-ab2e-2c238126621e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.872515 4553 generic.go:334] "Generic (PLEG): container finished" podID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerID="9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2" exitCode=0 Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.872548 4553 generic.go:334] "Generic (PLEG): container finished" podID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerID="b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b" exitCode=0 Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.872558 4553 generic.go:334] "Generic (PLEG): container finished" podID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerID="1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511" exitCode=0 Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.872579 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"063b4634-4c30-4ddd-ab2e-2c238126621e","Type":"ContainerDied","Data":"9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2"} Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.872605 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"063b4634-4c30-4ddd-ab2e-2c238126621e","Type":"ContainerDied","Data":"b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b"} Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.872614 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"063b4634-4c30-4ddd-ab2e-2c238126621e","Type":"ContainerDied","Data":"1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511"} Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.872623 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"063b4634-4c30-4ddd-ab2e-2c238126621e","Type":"ContainerDied","Data":"73346d129b1264e1112e76b2003efdd193384e9cb883ebe76badf1d31387c3a6"} Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.872637 4553 scope.go:117] "RemoveContainer" containerID="9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.872766 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.925285 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.925317 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063b4634-4c30-4ddd-ab2e-2c238126621e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.945445 4553 scope.go:117] "RemoveContainer" containerID="075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.952684 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.955401 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.972173 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.987454 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:28 crc kubenswrapper[4553]: E0930 19:50:28.987858 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="proxy-httpd" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.987874 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="proxy-httpd" Sep 30 19:50:28 crc kubenswrapper[4553]: E0930 19:50:28.987885 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="ceilometer-central-agent" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.987890 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="ceilometer-central-agent" Sep 30 19:50:28 crc kubenswrapper[4553]: E0930 19:50:28.987897 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="ceilometer-notification-agent" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.987903 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="ceilometer-notification-agent" Sep 30 19:50:28 crc kubenswrapper[4553]: E0930 19:50:28.987937 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="sg-core" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.987945 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="sg-core" Sep 30 19:50:28 crc 
kubenswrapper[4553]: I0930 19:50:28.988128 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="ceilometer-notification-agent" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.988139 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="sg-core" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.988157 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="ceilometer-central-agent" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.988166 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" containerName="proxy-httpd" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.989827 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:28 crc kubenswrapper[4553]: I0930 19:50:28.995676 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.018823 4553 scope.go:117] "RemoveContainer" containerID="b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.019213 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.019416 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.101509 4553 scope.go:117] "RemoveContainer" containerID="1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.128151 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8nx2\" 
(UniqueName: \"kubernetes.io/projected/8188090a-3304-47b3-9f57-936cfa9db056-kube-api-access-s8nx2\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.128203 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-config-data\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.128229 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-scripts\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.128258 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8188090a-3304-47b3-9f57-936cfa9db056-log-httpd\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.128289 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8188090a-3304-47b3-9f57-936cfa9db056-run-httpd\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.128304 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.128380 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.135766 4553 scope.go:117] "RemoveContainer" containerID="9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2" Sep 30 19:50:29 crc kubenswrapper[4553]: E0930 19:50:29.136568 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2\": container with ID starting with 9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2 not found: ID does not exist" containerID="9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.136598 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2"} err="failed to get container status \"9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2\": rpc error: code = NotFound desc = could not find container \"9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2\": container with ID starting with 9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2 not found: ID does not exist" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.136621 4553 scope.go:117] "RemoveContainer" containerID="075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e" Sep 30 19:50:29 crc kubenswrapper[4553]: E0930 19:50:29.137219 4553 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e\": container with ID starting with 075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e not found: ID does not exist" containerID="075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.137241 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e"} err="failed to get container status \"075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e\": rpc error: code = NotFound desc = could not find container \"075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e\": container with ID starting with 075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e not found: ID does not exist" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.137256 4553 scope.go:117] "RemoveContainer" containerID="b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b" Sep 30 19:50:29 crc kubenswrapper[4553]: E0930 19:50:29.137895 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b\": container with ID starting with b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b not found: ID does not exist" containerID="b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.137915 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b"} err="failed to get container status \"b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b\": rpc error: code = NotFound desc = could not find container 
\"b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b\": container with ID starting with b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b not found: ID does not exist" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.137927 4553 scope.go:117] "RemoveContainer" containerID="1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511" Sep 30 19:50:29 crc kubenswrapper[4553]: E0930 19:50:29.139532 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511\": container with ID starting with 1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511 not found: ID does not exist" containerID="1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.139558 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511"} err="failed to get container status \"1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511\": rpc error: code = NotFound desc = could not find container \"1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511\": container with ID starting with 1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511 not found: ID does not exist" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.139578 4553 scope.go:117] "RemoveContainer" containerID="9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.141335 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2"} err="failed to get container status \"9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2\": rpc error: code = NotFound desc = could not find 
container \"9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2\": container with ID starting with 9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2 not found: ID does not exist" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.141358 4553 scope.go:117] "RemoveContainer" containerID="075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.144156 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e"} err="failed to get container status \"075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e\": rpc error: code = NotFound desc = could not find container \"075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e\": container with ID starting with 075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e not found: ID does not exist" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.144178 4553 scope.go:117] "RemoveContainer" containerID="b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.144845 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b"} err="failed to get container status \"b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b\": rpc error: code = NotFound desc = could not find container \"b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b\": container with ID starting with b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b not found: ID does not exist" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.144866 4553 scope.go:117] "RemoveContainer" containerID="1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.145204 4553 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511"} err="failed to get container status \"1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511\": rpc error: code = NotFound desc = could not find container \"1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511\": container with ID starting with 1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511 not found: ID does not exist" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.145238 4553 scope.go:117] "RemoveContainer" containerID="9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.145548 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2"} err="failed to get container status \"9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2\": rpc error: code = NotFound desc = could not find container \"9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2\": container with ID starting with 9ebd94faa267d2b176dbf9c7a875ad6675174d5d86c20b7641173035968decb2 not found: ID does not exist" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.145568 4553 scope.go:117] "RemoveContainer" containerID="075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.146056 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e"} err="failed to get container status \"075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e\": rpc error: code = NotFound desc = could not find container \"075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e\": container with ID starting with 
075e625eaf65b1c3d764330ef5c8154daa649866e527a3c69a2e0ed7666de71e not found: ID does not exist" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.146076 4553 scope.go:117] "RemoveContainer" containerID="b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.146416 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b"} err="failed to get container status \"b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b\": rpc error: code = NotFound desc = could not find container \"b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b\": container with ID starting with b4d6f82f1289463c5c0495c22ae4e1429b07a836b7e22ea1ad99d5bdb127c99b not found: ID does not exist" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.146434 4553 scope.go:117] "RemoveContainer" containerID="1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.146787 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511"} err="failed to get container status \"1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511\": rpc error: code = NotFound desc = could not find container \"1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511\": container with ID starting with 1ff0aa70a57b666bcfc3ae5d6ce65b9ae87ba5e03a0945aef5c2d015b8b79511 not found: ID does not exist" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.229813 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8188090a-3304-47b3-9f57-936cfa9db056-run-httpd\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 
19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.229854 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.229936 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.229975 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8nx2\" (UniqueName: \"kubernetes.io/projected/8188090a-3304-47b3-9f57-936cfa9db056-kube-api-access-s8nx2\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.230001 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-config-data\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.230023 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-scripts\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.230056 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8188090a-3304-47b3-9f57-936cfa9db056-log-httpd\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.230448 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8188090a-3304-47b3-9f57-936cfa9db056-log-httpd\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.230620 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8188090a-3304-47b3-9f57-936cfa9db056-run-httpd\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.238946 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-config-data\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.239533 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.239799 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.239812 4553 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-scripts\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.275759 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8nx2\" (UniqueName: \"kubernetes.io/projected/8188090a-3304-47b3-9f57-936cfa9db056-kube-api-access-s8nx2\") pod \"ceilometer-0\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.372819 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.460567 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-74bb65c547-lfcd8"] Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.518260 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="063b4634-4c30-4ddd-ab2e-2c238126621e" path="/var/lib/kubelet/pods/063b4634-4c30-4ddd-ab2e-2c238126621e/volumes" Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.886898 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.904079 4553 generic.go:334] "Generic (PLEG): container finished" podID="17921f25-bee1-4e2e-a9e2-50669133664e" containerID="865daf527791fb42a7e38b3ccc019bcf19e002bf322605476e21aceb0aab4be7" exitCode=0 Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.904151 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c849768b-8k9mh" event={"ID":"17921f25-bee1-4e2e-a9e2-50669133664e","Type":"ContainerDied","Data":"865daf527791fb42a7e38b3ccc019bcf19e002bf322605476e21aceb0aab4be7"} Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.922807 4553 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-proxy-74bb65c547-lfcd8" event={"ID":"d8354628-b925-4289-9c68-7038dd4b2064","Type":"ContainerStarted","Data":"9f5b178f3db0265b54bd193d92a11ff8790220fabba58eeb9b8a26c9ab4fe79a"} Sep 30 19:50:29 crc kubenswrapper[4553]: I0930 19:50:29.922848 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74bb65c547-lfcd8" event={"ID":"d8354628-b925-4289-9c68-7038dd4b2064","Type":"ContainerStarted","Data":"b1d17e3110f10e9ff4f1f7e7e83181e46dd9b4d37fcf617179971648b48084d8"} Sep 30 19:50:30 crc kubenswrapper[4553]: I0930 19:50:30.932598 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74bb65c547-lfcd8" event={"ID":"d8354628-b925-4289-9c68-7038dd4b2064","Type":"ContainerStarted","Data":"4357628b13f8f1daa0a839f29f596551883fe039e4f113d3e5f2a7c2f6a93741"} Sep 30 19:50:30 crc kubenswrapper[4553]: I0930 19:50:30.933109 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:30 crc kubenswrapper[4553]: I0930 19:50:30.934405 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8188090a-3304-47b3-9f57-936cfa9db056","Type":"ContainerStarted","Data":"fa2b48401a7c83e19637b17c97d2ef1564f0bdeb6eef47aca4b713e44e083a4a"} Sep 30 19:50:30 crc kubenswrapper[4553]: I0930 19:50:30.967504 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-74bb65c547-lfcd8" podStartSLOduration=2.967486645 podStartE2EDuration="2.967486645s" podCreationTimestamp="2025-09-30 19:50:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:50:30.960270471 +0000 UTC m=+1084.159772601" watchObservedRunningTime="2025-09-30 19:50:30.967486645 +0000 UTC m=+1084.166988775" Sep 30 19:50:31 crc kubenswrapper[4553]: I0930 19:50:31.427536 4553 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/horizon-84c849768b-8k9mh" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Sep 30 19:50:31 crc kubenswrapper[4553]: I0930 19:50:31.949252 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:32 crc kubenswrapper[4553]: I0930 19:50:32.039383 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:32 crc kubenswrapper[4553]: I0930 19:50:32.249200 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:50:32 crc kubenswrapper[4553]: I0930 19:50:32.249456 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" containerName="glance-log" containerID="cri-o://5245f0f62b37906870ac327c579cdf7896ac5667eec53691486f374e74c80ba6" gracePeriod=30 Sep 30 19:50:32 crc kubenswrapper[4553]: I0930 19:50:32.249519 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" containerName="glance-httpd" containerID="cri-o://27bcd308b1b3b96ab1caae008d19f9236f5222fe9841d2ca6606cb6543edbaad" gracePeriod=30 Sep 30 19:50:32 crc kubenswrapper[4553]: I0930 19:50:32.960460 4553 generic.go:334] "Generic (PLEG): container finished" podID="5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" containerID="5245f0f62b37906870ac327c579cdf7896ac5667eec53691486f374e74c80ba6" exitCode=143 Sep 30 19:50:32 crc kubenswrapper[4553]: I0930 19:50:32.960651 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125","Type":"ContainerDied","Data":"5245f0f62b37906870ac327c579cdf7896ac5667eec53691486f374e74c80ba6"} Sep 30 19:50:33 crc kubenswrapper[4553]: I0930 19:50:33.579454 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:50:33 crc kubenswrapper[4553]: I0930 19:50:33.579768 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8b75f5b7-0080-4f75-9012-c89c87d08202" containerName="glance-log" containerID="cri-o://1886f660e393872d040f7bb9a8675d9af5153af6883480cd52e1543eff8c3e7e" gracePeriod=30 Sep 30 19:50:33 crc kubenswrapper[4553]: I0930 19:50:33.580002 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8b75f5b7-0080-4f75-9012-c89c87d08202" containerName="glance-httpd" containerID="cri-o://fae9530c48673a42c1b2c5cff236adc67b416ff8342ccaeb00fc52112e0c160e" gracePeriod=30 Sep 30 19:50:33 crc kubenswrapper[4553]: I0930 19:50:33.982635 4553 generic.go:334] "Generic (PLEG): container finished" podID="8b75f5b7-0080-4f75-9012-c89c87d08202" containerID="1886f660e393872d040f7bb9a8675d9af5153af6883480cd52e1543eff8c3e7e" exitCode=143 Sep 30 19:50:33 crc kubenswrapper[4553]: I0930 19:50:33.982698 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b75f5b7-0080-4f75-9012-c89c87d08202","Type":"ContainerDied","Data":"1886f660e393872d040f7bb9a8675d9af5153af6883480cd52e1543eff8c3e7e"} Sep 30 19:50:35 crc kubenswrapper[4553]: I0930 19:50:35.425320 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.152:9292/healthcheck\": read tcp 10.217.0.2:36466->10.217.0.152:9292: read: connection reset by peer" 
Sep 30 19:50:35 crc kubenswrapper[4553]: I0930 19:50:35.425613 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9292/healthcheck\": read tcp 10.217.0.2:36464->10.217.0.152:9292: read: connection reset by peer" Sep 30 19:50:36 crc kubenswrapper[4553]: I0930 19:50:36.001442 4553 generic.go:334] "Generic (PLEG): container finished" podID="5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" containerID="27bcd308b1b3b96ab1caae008d19f9236f5222fe9841d2ca6606cb6543edbaad" exitCode=0 Sep 30 19:50:36 crc kubenswrapper[4553]: I0930 19:50:36.001483 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125","Type":"ContainerDied","Data":"27bcd308b1b3b96ab1caae008d19f9236f5222fe9841d2ca6606cb6543edbaad"} Sep 30 19:50:36 crc kubenswrapper[4553]: I0930 19:50:36.774630 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="8b75f5b7-0080-4f75-9012-c89c87d08202" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": read tcp 10.217.0.2:57842->10.217.0.153:9292: read: connection reset by peer" Sep 30 19:50:36 crc kubenswrapper[4553]: I0930 19:50:36.775107 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="8b75f5b7-0080-4f75-9012-c89c87d08202" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.153:9292/healthcheck\": read tcp 10.217.0.2:57856->10.217.0.153:9292: read: connection reset by peer" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.016428 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8188090a-3304-47b3-9f57-936cfa9db056","Type":"ContainerStarted","Data":"8d2e72d63569a6c6c3794f5121b7a6d5a605eb6179b019353eb1095df1e848d6"} Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.022450 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b81cd96a-9a9f-4334-8512-34fb38da918f","Type":"ContainerStarted","Data":"0f9ff7e8b548fe5068ebe488e431ba28f0cd8516ba14766a74f7096691315eb3"} Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.027140 4553 generic.go:334] "Generic (PLEG): container finished" podID="8b75f5b7-0080-4f75-9012-c89c87d08202" containerID="fae9530c48673a42c1b2c5cff236adc67b416ff8342ccaeb00fc52112e0c160e" exitCode=0 Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.027184 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b75f5b7-0080-4f75-9012-c89c87d08202","Type":"ContainerDied","Data":"fae9530c48673a42c1b2c5cff236adc67b416ff8342ccaeb00fc52112e0c160e"} Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.141546 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.191547 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.038418519 podStartE2EDuration="14.191513471s" podCreationTimestamp="2025-09-30 19:50:23 +0000 UTC" firstStartedPulling="2025-09-30 19:50:24.546226707 +0000 UTC m=+1077.745728837" lastFinishedPulling="2025-09-30 19:50:36.699321659 +0000 UTC m=+1089.898823789" observedRunningTime="2025-09-30 19:50:37.043261747 +0000 UTC m=+1090.242763877" watchObservedRunningTime="2025-09-30 19:50:37.191513471 +0000 UTC m=+1090.391015601" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.193000 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.193045 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-logs\") pod \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.193113 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-scripts\") pod \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.193227 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-httpd-run\") pod \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\" (UID: 
\"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.193266 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-public-tls-certs\") pod \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.193284 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwnts\" (UniqueName: \"kubernetes.io/projected/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-kube-api-access-lwnts\") pod \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.193313 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-combined-ca-bundle\") pod \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.193386 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-config-data\") pod \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\" (UID: \"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.195024 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" (UID: "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.198555 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" (UID: "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.198812 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-logs" (OuterVolumeSpecName: "logs") pod "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" (UID: "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.200634 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-kube-api-access-lwnts" (OuterVolumeSpecName: "kube-api-access-lwnts") pod "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" (UID: "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125"). InnerVolumeSpecName "kube-api-access-lwnts". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.203164 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-scripts" (OuterVolumeSpecName: "scripts") pod "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" (UID: "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.252069 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" (UID: "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.255284 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" (UID: "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.297609 4553 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.297634 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.297645 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.297652 4553 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 
19:50:37.297660 4553 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.297669 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwnts\" (UniqueName: \"kubernetes.io/projected/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-kube-api-access-lwnts\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.297677 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.338210 4553 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.344387 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.354639 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-config-data" (OuterVolumeSpecName: "config-data") pod "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" (UID: "5c5a2be1-2db1-44cb-9ee2-a46ee27fe125"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.399213 4553 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.399245 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.500680 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b75f5b7-0080-4f75-9012-c89c87d08202-logs\") pod \"8b75f5b7-0080-4f75-9012-c89c87d08202\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.500730 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-config-data\") pod \"8b75f5b7-0080-4f75-9012-c89c87d08202\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.500752 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b75f5b7-0080-4f75-9012-c89c87d08202-httpd-run\") pod \"8b75f5b7-0080-4f75-9012-c89c87d08202\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.500776 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"8b75f5b7-0080-4f75-9012-c89c87d08202\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.500868 4553 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-scripts\") pod \"8b75f5b7-0080-4f75-9012-c89c87d08202\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.500892 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-combined-ca-bundle\") pod \"8b75f5b7-0080-4f75-9012-c89c87d08202\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.500911 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czfrm\" (UniqueName: \"kubernetes.io/projected/8b75f5b7-0080-4f75-9012-c89c87d08202-kube-api-access-czfrm\") pod \"8b75f5b7-0080-4f75-9012-c89c87d08202\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.500931 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-internal-tls-certs\") pod \"8b75f5b7-0080-4f75-9012-c89c87d08202\" (UID: \"8b75f5b7-0080-4f75-9012-c89c87d08202\") " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.504356 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b75f5b7-0080-4f75-9012-c89c87d08202-logs" (OuterVolumeSpecName: "logs") pod "8b75f5b7-0080-4f75-9012-c89c87d08202" (UID: "8b75f5b7-0080-4f75-9012-c89c87d08202"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.504698 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b75f5b7-0080-4f75-9012-c89c87d08202-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8b75f5b7-0080-4f75-9012-c89c87d08202" (UID: "8b75f5b7-0080-4f75-9012-c89c87d08202"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.512660 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-scripts" (OuterVolumeSpecName: "scripts") pod "8b75f5b7-0080-4f75-9012-c89c87d08202" (UID: "8b75f5b7-0080-4f75-9012-c89c87d08202"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.516141 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "8b75f5b7-0080-4f75-9012-c89c87d08202" (UID: "8b75f5b7-0080-4f75-9012-c89c87d08202"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.523578 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b75f5b7-0080-4f75-9012-c89c87d08202-kube-api-access-czfrm" (OuterVolumeSpecName: "kube-api-access-czfrm") pod "8b75f5b7-0080-4f75-9012-c89c87d08202" (UID: "8b75f5b7-0080-4f75-9012-c89c87d08202"). InnerVolumeSpecName "kube-api-access-czfrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.589350 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-config-data" (OuterVolumeSpecName: "config-data") pod "8b75f5b7-0080-4f75-9012-c89c87d08202" (UID: "8b75f5b7-0080-4f75-9012-c89c87d08202"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.600519 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b75f5b7-0080-4f75-9012-c89c87d08202" (UID: "8b75f5b7-0080-4f75-9012-c89c87d08202"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.603720 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b75f5b7-0080-4f75-9012-c89c87d08202-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.603788 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.603798 4553 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8b75f5b7-0080-4f75-9012-c89c87d08202-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.603840 4553 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 
19:50:37.603849 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.603984 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.604000 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czfrm\" (UniqueName: \"kubernetes.io/projected/8b75f5b7-0080-4f75-9012-c89c87d08202-kube-api-access-czfrm\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.635718 4553 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.642315 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8b75f5b7-0080-4f75-9012-c89c87d08202" (UID: "8b75f5b7-0080-4f75-9012-c89c87d08202"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.705643 4553 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:37 crc kubenswrapper[4553]: I0930 19:50:37.705674 4553 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b75f5b7-0080-4f75-9012-c89c87d08202-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.036349 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5c5a2be1-2db1-44cb-9ee2-a46ee27fe125","Type":"ContainerDied","Data":"3946f228f64355eeb6ceb0b10cd22babf305c008dd3c43635c21b3efe1c2e95d"} Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.036398 4553 scope.go:117] "RemoveContainer" containerID="27bcd308b1b3b96ab1caae008d19f9236f5222fe9841d2ca6606cb6543edbaad" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.036432 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.039664 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.039959 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8b75f5b7-0080-4f75-9012-c89c87d08202","Type":"ContainerDied","Data":"1d432502db72c94df0b6fe2abe249d7c72155fcd3ca318a75f1c6d8da0a64be5"} Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.045552 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8188090a-3304-47b3-9f57-936cfa9db056","Type":"ContainerStarted","Data":"a16b67f1d49498083ac75b162f2d8ccc96b8cfe8c6ebd8d883d5e06aa6901082"} Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.067451 4553 scope.go:117] "RemoveContainer" containerID="5245f0f62b37906870ac327c579cdf7896ac5667eec53691486f374e74c80ba6" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.075061 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.092779 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.092808 4553 scope.go:117] "RemoveContainer" containerID="fae9530c48673a42c1b2c5cff236adc67b416ff8342ccaeb00fc52112e0c160e" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.113890 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:50:38 crc kubenswrapper[4553]: E0930 19:50:38.114294 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" containerName="glance-httpd" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.114310 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" containerName="glance-httpd" Sep 30 19:50:38 crc kubenswrapper[4553]: E0930 19:50:38.114330 4553 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b75f5b7-0080-4f75-9012-c89c87d08202" containerName="glance-httpd" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.114337 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b75f5b7-0080-4f75-9012-c89c87d08202" containerName="glance-httpd" Sep 30 19:50:38 crc kubenswrapper[4553]: E0930 19:50:38.114345 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" containerName="glance-log" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.114351 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" containerName="glance-log" Sep 30 19:50:38 crc kubenswrapper[4553]: E0930 19:50:38.114374 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b75f5b7-0080-4f75-9012-c89c87d08202" containerName="glance-log" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.114379 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b75f5b7-0080-4f75-9012-c89c87d08202" containerName="glance-log" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.114535 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b75f5b7-0080-4f75-9012-c89c87d08202" containerName="glance-log" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.114553 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" containerName="glance-httpd" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.114561 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b75f5b7-0080-4f75-9012-c89c87d08202" containerName="glance-httpd" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.114576 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" containerName="glance-log" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.115464 4553 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.122550 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zv45w" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.122831 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.122955 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.123288 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.130240 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.133893 4553 scope.go:117] "RemoveContainer" containerID="1886f660e393872d040f7bb9a8675d9af5153af6883480cd52e1543eff8c3e7e" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.148186 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.176603 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.194547 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.196398 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.201624 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.201870 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.214637 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.216927 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.216980 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.217104 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.217194 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.217227 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-logs\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.217403 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.217426 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.217455 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7tsq\" (UniqueName: \"kubernetes.io/projected/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-kube-api-access-s7tsq\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318387 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318449 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318481 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318500 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318528 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7tsq\" (UniqueName: \"kubernetes.io/projected/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-kube-api-access-s7tsq\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318548 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bfz4\" 
(UniqueName: \"kubernetes.io/projected/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-kube-api-access-7bfz4\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318570 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318588 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318614 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318636 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318656 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318683 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318707 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318725 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318753 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.318772 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-logs\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.319182 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-logs\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.319429 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.319628 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.324587 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.328669 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.347409 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.348713 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.351330 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7tsq\" (UniqueName: \"kubernetes.io/projected/4f78bdf3-c263-41a0-9594-85b8c5b0dcd0-kube-api-access-s7tsq\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.377801 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0\") " pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.422066 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.422129 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bfz4\" (UniqueName: \"kubernetes.io/projected/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-kube-api-access-7bfz4\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.422154 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.422181 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.422207 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.422237 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 
crc kubenswrapper[4553]: I0930 19:50:38.422261 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.422312 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.422983 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.423169 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.423203 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.426415 4553 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.429442 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.435898 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.436337 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.446024 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bfz4\" (UniqueName: \"kubernetes.io/projected/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-kube-api-access-7bfz4\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.447257 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7456ce4-1ac9-4e11-9fc2-f8680cacd86a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.458576 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a\") " pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.537383 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.963218 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:38 crc kubenswrapper[4553]: I0930 19:50:38.965336 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-74bb65c547-lfcd8" Sep 30 19:50:39 crc kubenswrapper[4553]: I0930 19:50:39.104791 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8188090a-3304-47b3-9f57-936cfa9db056","Type":"ContainerStarted","Data":"1bbec1f251c73174a60ea59e71c7641038f3705720119454f69716fa8f20664e"} Sep 30 19:50:39 crc kubenswrapper[4553]: I0930 19:50:39.155064 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 19:50:39 crc kubenswrapper[4553]: I0930 19:50:39.245763 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 19:50:39 crc kubenswrapper[4553]: I0930 19:50:39.517094 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c5a2be1-2db1-44cb-9ee2-a46ee27fe125" path="/var/lib/kubelet/pods/5c5a2be1-2db1-44cb-9ee2-a46ee27fe125/volumes" Sep 30 19:50:39 crc kubenswrapper[4553]: I0930 19:50:39.523288 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b75f5b7-0080-4f75-9012-c89c87d08202" path="/var/lib/kubelet/pods/8b75f5b7-0080-4f75-9012-c89c87d08202/volumes" Sep 30 19:50:40 crc kubenswrapper[4553]: I0930 19:50:40.141549 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a","Type":"ContainerStarted","Data":"c0a859a496e2f7fcda9fff58f247d36014779b5836290f65e27159cf8d6bedf2"} Sep 30 19:50:40 crc kubenswrapper[4553]: I0930 19:50:40.141775 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a","Type":"ContainerStarted","Data":"3e448f9d187fee5f81c6852313186f647822b93f2d0e1732d137cd4d0af57b78"} Sep 30 19:50:40 crc kubenswrapper[4553]: I0930 19:50:40.142995 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0","Type":"ContainerStarted","Data":"65e1cefc71e5a9f22dec886dade5f801457a1e8a4015724e2b72bf62e267a3ad"} Sep 30 19:50:40 crc kubenswrapper[4553]: I0930 19:50:40.143013 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0","Type":"ContainerStarted","Data":"111007b2323074aa0b61b2212e34f83150fb6baba90b5be36decad1c58009c69"} Sep 30 19:50:41 crc kubenswrapper[4553]: I0930 19:50:41.178285 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b7456ce4-1ac9-4e11-9fc2-f8680cacd86a","Type":"ContainerStarted","Data":"7b60f8606d2695a63c8397c5c53f74c5a40a88868dd5ebba4f88d3e0feaf18cb"} Sep 30 19:50:41 crc kubenswrapper[4553]: I0930 19:50:41.188974 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f78bdf3-c263-41a0-9594-85b8c5b0dcd0","Type":"ContainerStarted","Data":"a8c222da9c89f5e5f4000478079761cc6c01ed874d6a284444d5d4b7be9f0185"} Sep 30 19:50:41 crc kubenswrapper[4553]: I0930 19:50:41.198063 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.198047429 podStartE2EDuration="3.198047429s" podCreationTimestamp="2025-09-30 19:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:50:41.195430709 +0000 UTC m=+1094.394932839" watchObservedRunningTime="2025-09-30 
19:50:41.198047429 +0000 UTC m=+1094.397549559" Sep 30 19:50:41 crc kubenswrapper[4553]: I0930 19:50:41.222818 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.222800016 podStartE2EDuration="3.222800016s" podCreationTimestamp="2025-09-30 19:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:50:41.217691498 +0000 UTC m=+1094.417193638" watchObservedRunningTime="2025-09-30 19:50:41.222800016 +0000 UTC m=+1094.422302146" Sep 30 19:50:41 crc kubenswrapper[4553]: I0930 19:50:41.427909 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84c849768b-8k9mh" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Sep 30 19:50:41 crc kubenswrapper[4553]: I0930 19:50:41.428330 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:50:42 crc kubenswrapper[4553]: I0930 19:50:42.198217 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8188090a-3304-47b3-9f57-936cfa9db056","Type":"ContainerStarted","Data":"62bef8e9a91f1cacf8d8dcf9245835f6499799b58f2b5d8339c195143dbe017d"} Sep 30 19:50:42 crc kubenswrapper[4553]: I0930 19:50:42.198590 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="ceilometer-central-agent" containerID="cri-o://8d2e72d63569a6c6c3794f5121b7a6d5a605eb6179b019353eb1095df1e848d6" gracePeriod=30 Sep 30 19:50:42 crc kubenswrapper[4553]: I0930 19:50:42.198664 4553 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="ceilometer-notification-agent" containerID="cri-o://a16b67f1d49498083ac75b162f2d8ccc96b8cfe8c6ebd8d883d5e06aa6901082" gracePeriod=30 Sep 30 19:50:42 crc kubenswrapper[4553]: I0930 19:50:42.198737 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="sg-core" containerID="cri-o://1bbec1f251c73174a60ea59e71c7641038f3705720119454f69716fa8f20664e" gracePeriod=30 Sep 30 19:50:42 crc kubenswrapper[4553]: I0930 19:50:42.198839 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="proxy-httpd" containerID="cri-o://62bef8e9a91f1cacf8d8dcf9245835f6499799b58f2b5d8339c195143dbe017d" gracePeriod=30 Sep 30 19:50:42 crc kubenswrapper[4553]: I0930 19:50:42.222796 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.08526744 podStartE2EDuration="14.222779059s" podCreationTimestamp="2025-09-30 19:50:28 +0000 UTC" firstStartedPulling="2025-09-30 19:50:29.919279424 +0000 UTC m=+1083.118781554" lastFinishedPulling="2025-09-30 19:50:41.056791033 +0000 UTC m=+1094.256293173" observedRunningTime="2025-09-30 19:50:42.21762415 +0000 UTC m=+1095.417126280" watchObservedRunningTime="2025-09-30 19:50:42.222779059 +0000 UTC m=+1095.422281189" Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.214697 4553 generic.go:334] "Generic (PLEG): container finished" podID="8188090a-3304-47b3-9f57-936cfa9db056" containerID="62bef8e9a91f1cacf8d8dcf9245835f6499799b58f2b5d8339c195143dbe017d" exitCode=0 Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.215079 4553 generic.go:334] "Generic (PLEG): container finished" podID="8188090a-3304-47b3-9f57-936cfa9db056" 
containerID="1bbec1f251c73174a60ea59e71c7641038f3705720119454f69716fa8f20664e" exitCode=2 Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.215089 4553 generic.go:334] "Generic (PLEG): container finished" podID="8188090a-3304-47b3-9f57-936cfa9db056" containerID="a16b67f1d49498083ac75b162f2d8ccc96b8cfe8c6ebd8d883d5e06aa6901082" exitCode=0 Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.214777 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8188090a-3304-47b3-9f57-936cfa9db056","Type":"ContainerDied","Data":"62bef8e9a91f1cacf8d8dcf9245835f6499799b58f2b5d8339c195143dbe017d"} Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.215128 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8188090a-3304-47b3-9f57-936cfa9db056","Type":"ContainerDied","Data":"1bbec1f251c73174a60ea59e71c7641038f3705720119454f69716fa8f20664e"} Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.215142 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8188090a-3304-47b3-9f57-936cfa9db056","Type":"ContainerDied","Data":"a16b67f1d49498083ac75b162f2d8ccc96b8cfe8c6ebd8d883d5e06aa6901082"} Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.820344 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.942827 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-config-data\") pod \"8188090a-3304-47b3-9f57-936cfa9db056\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.942896 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8188090a-3304-47b3-9f57-936cfa9db056-log-httpd\") pod \"8188090a-3304-47b3-9f57-936cfa9db056\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.942940 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8nx2\" (UniqueName: \"kubernetes.io/projected/8188090a-3304-47b3-9f57-936cfa9db056-kube-api-access-s8nx2\") pod \"8188090a-3304-47b3-9f57-936cfa9db056\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.943028 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8188090a-3304-47b3-9f57-936cfa9db056-run-httpd\") pod \"8188090a-3304-47b3-9f57-936cfa9db056\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.943126 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-scripts\") pod \"8188090a-3304-47b3-9f57-936cfa9db056\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.943191 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-combined-ca-bundle\") pod \"8188090a-3304-47b3-9f57-936cfa9db056\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.943229 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-sg-core-conf-yaml\") pod \"8188090a-3304-47b3-9f57-936cfa9db056\" (UID: \"8188090a-3304-47b3-9f57-936cfa9db056\") " Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.943314 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8188090a-3304-47b3-9f57-936cfa9db056-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8188090a-3304-47b3-9f57-936cfa9db056" (UID: "8188090a-3304-47b3-9f57-936cfa9db056"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.943789 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8188090a-3304-47b3-9f57-936cfa9db056-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8188090a-3304-47b3-9f57-936cfa9db056" (UID: "8188090a-3304-47b3-9f57-936cfa9db056"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.944073 4553 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8188090a-3304-47b3-9f57-936cfa9db056-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.944089 4553 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8188090a-3304-47b3-9f57-936cfa9db056-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.960227 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8188090a-3304-47b3-9f57-936cfa9db056-kube-api-access-s8nx2" (OuterVolumeSpecName: "kube-api-access-s8nx2") pod "8188090a-3304-47b3-9f57-936cfa9db056" (UID: "8188090a-3304-47b3-9f57-936cfa9db056"). InnerVolumeSpecName "kube-api-access-s8nx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.974591 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-scripts" (OuterVolumeSpecName: "scripts") pod "8188090a-3304-47b3-9f57-936cfa9db056" (UID: "8188090a-3304-47b3-9f57-936cfa9db056"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:43 crc kubenswrapper[4553]: I0930 19:50:43.974859 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8188090a-3304-47b3-9f57-936cfa9db056" (UID: "8188090a-3304-47b3-9f57-936cfa9db056"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.034851 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8188090a-3304-47b3-9f57-936cfa9db056" (UID: "8188090a-3304-47b3-9f57-936cfa9db056"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.046113 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.046288 4553 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.046360 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8nx2\" (UniqueName: \"kubernetes.io/projected/8188090a-3304-47b3-9f57-936cfa9db056-kube-api-access-s8nx2\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.046429 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.060471 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-config-data" (OuterVolumeSpecName: "config-data") pod "8188090a-3304-47b3-9f57-936cfa9db056" (UID: "8188090a-3304-47b3-9f57-936cfa9db056"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.147609 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8188090a-3304-47b3-9f57-936cfa9db056-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.226222 4553 generic.go:334] "Generic (PLEG): container finished" podID="8188090a-3304-47b3-9f57-936cfa9db056" containerID="8d2e72d63569a6c6c3794f5121b7a6d5a605eb6179b019353eb1095df1e848d6" exitCode=0 Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.226265 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8188090a-3304-47b3-9f57-936cfa9db056","Type":"ContainerDied","Data":"8d2e72d63569a6c6c3794f5121b7a6d5a605eb6179b019353eb1095df1e848d6"} Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.226292 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8188090a-3304-47b3-9f57-936cfa9db056","Type":"ContainerDied","Data":"fa2b48401a7c83e19637b17c97d2ef1564f0bdeb6eef47aca4b713e44e083a4a"} Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.226312 4553 scope.go:117] "RemoveContainer" containerID="62bef8e9a91f1cacf8d8dcf9245835f6499799b58f2b5d8339c195143dbe017d" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.227245 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.247653 4553 scope.go:117] "RemoveContainer" containerID="1bbec1f251c73174a60ea59e71c7641038f3705720119454f69716fa8f20664e" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.254892 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.265432 4553 scope.go:117] "RemoveContainer" containerID="a16b67f1d49498083ac75b162f2d8ccc96b8cfe8c6ebd8d883d5e06aa6901082" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.272674 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.281218 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:44 crc kubenswrapper[4553]: E0930 19:50:44.281552 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="ceilometer-notification-agent" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.281568 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="ceilometer-notification-agent" Sep 30 19:50:44 crc kubenswrapper[4553]: E0930 19:50:44.281578 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="sg-core" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.281586 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="sg-core" Sep 30 19:50:44 crc kubenswrapper[4553]: E0930 19:50:44.281623 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="ceilometer-central-agent" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.281629 4553 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="ceilometer-central-agent" Sep 30 19:50:44 crc kubenswrapper[4553]: E0930 19:50:44.281640 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="proxy-httpd" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.281645 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="proxy-httpd" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.281807 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="ceilometer-notification-agent" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.281825 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="sg-core" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.281840 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="proxy-httpd" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.281853 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="8188090a-3304-47b3-9f57-936cfa9db056" containerName="ceilometer-central-agent" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.283389 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.287714 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.287731 4553 scope.go:117] "RemoveContainer" containerID="8d2e72d63569a6c6c3794f5121b7a6d5a605eb6179b019353eb1095df1e848d6" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.288231 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.316746 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.323666 4553 scope.go:117] "RemoveContainer" containerID="62bef8e9a91f1cacf8d8dcf9245835f6499799b58f2b5d8339c195143dbe017d" Sep 30 19:50:44 crc kubenswrapper[4553]: E0930 19:50:44.324206 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62bef8e9a91f1cacf8d8dcf9245835f6499799b58f2b5d8339c195143dbe017d\": container with ID starting with 62bef8e9a91f1cacf8d8dcf9245835f6499799b58f2b5d8339c195143dbe017d not found: ID does not exist" containerID="62bef8e9a91f1cacf8d8dcf9245835f6499799b58f2b5d8339c195143dbe017d" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.324235 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62bef8e9a91f1cacf8d8dcf9245835f6499799b58f2b5d8339c195143dbe017d"} err="failed to get container status \"62bef8e9a91f1cacf8d8dcf9245835f6499799b58f2b5d8339c195143dbe017d\": rpc error: code = NotFound desc = could not find container \"62bef8e9a91f1cacf8d8dcf9245835f6499799b58f2b5d8339c195143dbe017d\": container with ID starting with 62bef8e9a91f1cacf8d8dcf9245835f6499799b58f2b5d8339c195143dbe017d not found: ID does not exist" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 
19:50:44.324256 4553 scope.go:117] "RemoveContainer" containerID="1bbec1f251c73174a60ea59e71c7641038f3705720119454f69716fa8f20664e" Sep 30 19:50:44 crc kubenswrapper[4553]: E0930 19:50:44.324687 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bbec1f251c73174a60ea59e71c7641038f3705720119454f69716fa8f20664e\": container with ID starting with 1bbec1f251c73174a60ea59e71c7641038f3705720119454f69716fa8f20664e not found: ID does not exist" containerID="1bbec1f251c73174a60ea59e71c7641038f3705720119454f69716fa8f20664e" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.324707 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bbec1f251c73174a60ea59e71c7641038f3705720119454f69716fa8f20664e"} err="failed to get container status \"1bbec1f251c73174a60ea59e71c7641038f3705720119454f69716fa8f20664e\": rpc error: code = NotFound desc = could not find container \"1bbec1f251c73174a60ea59e71c7641038f3705720119454f69716fa8f20664e\": container with ID starting with 1bbec1f251c73174a60ea59e71c7641038f3705720119454f69716fa8f20664e not found: ID does not exist" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.324720 4553 scope.go:117] "RemoveContainer" containerID="a16b67f1d49498083ac75b162f2d8ccc96b8cfe8c6ebd8d883d5e06aa6901082" Sep 30 19:50:44 crc kubenswrapper[4553]: E0930 19:50:44.324971 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a16b67f1d49498083ac75b162f2d8ccc96b8cfe8c6ebd8d883d5e06aa6901082\": container with ID starting with a16b67f1d49498083ac75b162f2d8ccc96b8cfe8c6ebd8d883d5e06aa6901082 not found: ID does not exist" containerID="a16b67f1d49498083ac75b162f2d8ccc96b8cfe8c6ebd8d883d5e06aa6901082" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.324992 4553 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a16b67f1d49498083ac75b162f2d8ccc96b8cfe8c6ebd8d883d5e06aa6901082"} err="failed to get container status \"a16b67f1d49498083ac75b162f2d8ccc96b8cfe8c6ebd8d883d5e06aa6901082\": rpc error: code = NotFound desc = could not find container \"a16b67f1d49498083ac75b162f2d8ccc96b8cfe8c6ebd8d883d5e06aa6901082\": container with ID starting with a16b67f1d49498083ac75b162f2d8ccc96b8cfe8c6ebd8d883d5e06aa6901082 not found: ID does not exist" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.325005 4553 scope.go:117] "RemoveContainer" containerID="8d2e72d63569a6c6c3794f5121b7a6d5a605eb6179b019353eb1095df1e848d6" Sep 30 19:50:44 crc kubenswrapper[4553]: E0930 19:50:44.325372 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d2e72d63569a6c6c3794f5121b7a6d5a605eb6179b019353eb1095df1e848d6\": container with ID starting with 8d2e72d63569a6c6c3794f5121b7a6d5a605eb6179b019353eb1095df1e848d6 not found: ID does not exist" containerID="8d2e72d63569a6c6c3794f5121b7a6d5a605eb6179b019353eb1095df1e848d6" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.325399 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d2e72d63569a6c6c3794f5121b7a6d5a605eb6179b019353eb1095df1e848d6"} err="failed to get container status \"8d2e72d63569a6c6c3794f5121b7a6d5a605eb6179b019353eb1095df1e848d6\": rpc error: code = NotFound desc = could not find container \"8d2e72d63569a6c6c3794f5121b7a6d5a605eb6179b019353eb1095df1e848d6\": container with ID starting with 8d2e72d63569a6c6c3794f5121b7a6d5a605eb6179b019353eb1095df1e848d6 not found: ID does not exist" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.351984 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-scripts\") pod \"ceilometer-0\" (UID: 
\"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.352032 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.352077 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7j7m\" (UniqueName: \"kubernetes.io/projected/0473d309-242c-4f24-a47b-f8b459aa6394-kube-api-access-b7j7m\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.352216 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-config-data\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.352286 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.352340 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0473d309-242c-4f24-a47b-f8b459aa6394-run-httpd\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: 
I0930 19:50:44.352358 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0473d309-242c-4f24-a47b-f8b459aa6394-log-httpd\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.366633 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:44 crc kubenswrapper[4553]: E0930 19:50:44.367285 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-b7j7m log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="0473d309-242c-4f24-a47b-f8b459aa6394" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.454261 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-scripts\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.454294 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.454330 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7j7m\" (UniqueName: \"kubernetes.io/projected/0473d309-242c-4f24-a47b-f8b459aa6394-kube-api-access-b7j7m\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.454361 
4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-config-data\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.454383 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.454406 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0473d309-242c-4f24-a47b-f8b459aa6394-run-httpd\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.454420 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0473d309-242c-4f24-a47b-f8b459aa6394-log-httpd\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.454871 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0473d309-242c-4f24-a47b-f8b459aa6394-log-httpd\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.455369 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0473d309-242c-4f24-a47b-f8b459aa6394-run-httpd\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " 
pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.458458 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.458539 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-scripts\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.459473 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.462732 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-config-data\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:44 crc kubenswrapper[4553]: I0930 19:50:44.469902 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7j7m\" (UniqueName: \"kubernetes.io/projected/0473d309-242c-4f24-a47b-f8b459aa6394-kube-api-access-b7j7m\") pod \"ceilometer-0\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " pod="openstack/ceilometer-0" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.237069 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.249208 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.369748 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-sg-core-conf-yaml\") pod \"0473d309-242c-4f24-a47b-f8b459aa6394\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.369849 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-config-data\") pod \"0473d309-242c-4f24-a47b-f8b459aa6394\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.369877 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-scripts\") pod \"0473d309-242c-4f24-a47b-f8b459aa6394\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.369956 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7j7m\" (UniqueName: \"kubernetes.io/projected/0473d309-242c-4f24-a47b-f8b459aa6394-kube-api-access-b7j7m\") pod \"0473d309-242c-4f24-a47b-f8b459aa6394\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.369987 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0473d309-242c-4f24-a47b-f8b459aa6394-log-httpd\") pod \"0473d309-242c-4f24-a47b-f8b459aa6394\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " Sep 30 19:50:45 crc 
kubenswrapper[4553]: I0930 19:50:45.370100 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0473d309-242c-4f24-a47b-f8b459aa6394-run-httpd\") pod \"0473d309-242c-4f24-a47b-f8b459aa6394\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.370165 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-combined-ca-bundle\") pod \"0473d309-242c-4f24-a47b-f8b459aa6394\" (UID: \"0473d309-242c-4f24-a47b-f8b459aa6394\") " Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.374309 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0473d309-242c-4f24-a47b-f8b459aa6394-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0473d309-242c-4f24-a47b-f8b459aa6394" (UID: "0473d309-242c-4f24-a47b-f8b459aa6394"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.374562 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0473d309-242c-4f24-a47b-f8b459aa6394-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0473d309-242c-4f24-a47b-f8b459aa6394" (UID: "0473d309-242c-4f24-a47b-f8b459aa6394"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.380610 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-config-data" (OuterVolumeSpecName: "config-data") pod "0473d309-242c-4f24-a47b-f8b459aa6394" (UID: "0473d309-242c-4f24-a47b-f8b459aa6394"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.382322 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0473d309-242c-4f24-a47b-f8b459aa6394-kube-api-access-b7j7m" (OuterVolumeSpecName: "kube-api-access-b7j7m") pod "0473d309-242c-4f24-a47b-f8b459aa6394" (UID: "0473d309-242c-4f24-a47b-f8b459aa6394"). InnerVolumeSpecName "kube-api-access-b7j7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.383541 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-scripts" (OuterVolumeSpecName: "scripts") pod "0473d309-242c-4f24-a47b-f8b459aa6394" (UID: "0473d309-242c-4f24-a47b-f8b459aa6394"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.388388 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0473d309-242c-4f24-a47b-f8b459aa6394" (UID: "0473d309-242c-4f24-a47b-f8b459aa6394"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.389722 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0473d309-242c-4f24-a47b-f8b459aa6394" (UID: "0473d309-242c-4f24-a47b-f8b459aa6394"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.472145 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.472170 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.472180 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7j7m\" (UniqueName: \"kubernetes.io/projected/0473d309-242c-4f24-a47b-f8b459aa6394-kube-api-access-b7j7m\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.472189 4553 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0473d309-242c-4f24-a47b-f8b459aa6394-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.472197 4553 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0473d309-242c-4f24-a47b-f8b459aa6394-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.472205 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.472214 4553 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0473d309-242c-4f24-a47b-f8b459aa6394-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:45 crc kubenswrapper[4553]: I0930 19:50:45.514420 4553 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="8188090a-3304-47b3-9f57-936cfa9db056" path="/var/lib/kubelet/pods/8188090a-3304-47b3-9f57-936cfa9db056/volumes" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.244817 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.309913 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.319683 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.325271 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.327286 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.330797 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.330989 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.342074 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.388310 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-config-data\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.388348 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e89116ff-9090-4e93-b28b-e2782c16f192-run-httpd\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.388403 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89116ff-9090-4e93-b28b-e2782c16f192-log-httpd\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.388436 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-scripts\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.388548 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67n9f\" (UniqueName: \"kubernetes.io/projected/e89116ff-9090-4e93-b28b-e2782c16f192-kube-api-access-67n9f\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.388683 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.388789 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.490296 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89116ff-9090-4e93-b28b-e2782c16f192-log-httpd\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.491196 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-scripts\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.491429 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67n9f\" (UniqueName: \"kubernetes.io/projected/e89116ff-9090-4e93-b28b-e2782c16f192-kube-api-access-67n9f\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.491607 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.491788 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.491925 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-config-data\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.492023 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89116ff-9090-4e93-b28b-e2782c16f192-run-httpd\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.492428 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89116ff-9090-4e93-b28b-e2782c16f192-run-httpd\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.491103 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89116ff-9090-4e93-b28b-e2782c16f192-log-httpd\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.497975 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-config-data\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.503440 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.506617 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-scripts\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.512425 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67n9f\" (UniqueName: \"kubernetes.io/projected/e89116ff-9090-4e93-b28b-e2782c16f192-kube-api-access-67n9f\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.513439 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " pod="openstack/ceilometer-0" Sep 30 19:50:46 crc kubenswrapper[4553]: I0930 19:50:46.667862 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:47 crc kubenswrapper[4553]: I0930 19:50:47.096023 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:47 crc kubenswrapper[4553]: W0930 19:50:47.097758 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode89116ff_9090_4e93_b28b_e2782c16f192.slice/crio-71883a66518f9895fe58df3c4da4dcd8cf958cd7f0ed6c820a15d7c5f98a7e21 WatchSource:0}: Error finding container 71883a66518f9895fe58df3c4da4dcd8cf958cd7f0ed6c820a15d7c5f98a7e21: Status 404 returned error can't find the container with id 71883a66518f9895fe58df3c4da4dcd8cf958cd7f0ed6c820a15d7c5f98a7e21 Sep 30 19:50:47 crc kubenswrapper[4553]: I0930 19:50:47.255778 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89116ff-9090-4e93-b28b-e2782c16f192","Type":"ContainerStarted","Data":"71883a66518f9895fe58df3c4da4dcd8cf958cd7f0ed6c820a15d7c5f98a7e21"} Sep 30 19:50:47 crc kubenswrapper[4553]: I0930 19:50:47.520136 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0473d309-242c-4f24-a47b-f8b459aa6394" path="/var/lib/kubelet/pods/0473d309-242c-4f24-a47b-f8b459aa6394/volumes" Sep 30 19:50:48 crc kubenswrapper[4553]: I0930 19:50:48.282770 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89116ff-9090-4e93-b28b-e2782c16f192","Type":"ContainerStarted","Data":"ab1a5ac675df971f36bece18ca185ad186b788f58ed257c02be4797583517d01"} Sep 30 19:50:48 crc kubenswrapper[4553]: I0930 19:50:48.438003 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 19:50:48 crc kubenswrapper[4553]: I0930 19:50:48.438263 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 19:50:48 crc 
kubenswrapper[4553]: I0930 19:50:48.465454 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 19:50:48 crc kubenswrapper[4553]: I0930 19:50:48.473631 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 19:50:48 crc kubenswrapper[4553]: I0930 19:50:48.538977 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 19:50:48 crc kubenswrapper[4553]: I0930 19:50:48.539056 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 19:50:48 crc kubenswrapper[4553]: I0930 19:50:48.578601 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 19:50:48 crc kubenswrapper[4553]: I0930 19:50:48.580441 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 19:50:49 crc kubenswrapper[4553]: I0930 19:50:49.299230 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89116ff-9090-4e93-b28b-e2782c16f192","Type":"ContainerStarted","Data":"761da65a56b1a70f926d9aa211683057be02e99ed87dfb3b5ad4d0b7b2d15399"} Sep 30 19:50:49 crc kubenswrapper[4553]: I0930 19:50:49.299515 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 19:50:49 crc kubenswrapper[4553]: I0930 19:50:49.301311 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 19:50:49 crc kubenswrapper[4553]: I0930 19:50:49.301335 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 19:50:49 crc kubenswrapper[4553]: I0930 19:50:49.301343 4553 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 19:50:50 crc kubenswrapper[4553]: I0930 19:50:50.306970 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89116ff-9090-4e93-b28b-e2782c16f192","Type":"ContainerStarted","Data":"03fa552d053a24845cfb7f4ec12f3a6073480d03333f46d970c161cd27b80abb"} Sep 30 19:50:51 crc kubenswrapper[4553]: I0930 19:50:51.315841 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89116ff-9090-4e93-b28b-e2782c16f192","Type":"ContainerStarted","Data":"00ee1f33b4c828339a4de7443e2130e156cb0ef654faf400a67a9a5a20a393fe"} Sep 30 19:50:51 crc kubenswrapper[4553]: I0930 19:50:51.316430 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 19:50:51 crc kubenswrapper[4553]: I0930 19:50:51.315933 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 19:50:51 crc kubenswrapper[4553]: I0930 19:50:51.316452 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 19:50:51 crc kubenswrapper[4553]: I0930 19:50:51.315874 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 19:50:51 crc kubenswrapper[4553]: I0930 19:50:51.316553 4553 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 19:50:51 crc kubenswrapper[4553]: I0930 19:50:51.336824 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.534140447 podStartE2EDuration="5.336804444s" podCreationTimestamp="2025-09-30 19:50:46 +0000 UTC" firstStartedPulling="2025-09-30 19:50:47.100701256 +0000 UTC m=+1100.300203376" lastFinishedPulling="2025-09-30 19:50:50.903365243 +0000 UTC m=+1104.102867373" observedRunningTime="2025-09-30 19:50:51.333809524 +0000 UTC m=+1104.533311654" watchObservedRunningTime="2025-09-30 
19:50:51.336804444 +0000 UTC m=+1104.536306574" Sep 30 19:50:51 crc kubenswrapper[4553]: I0930 19:50:51.428648 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84c849768b-8k9mh" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Sep 30 19:50:51 crc kubenswrapper[4553]: I0930 19:50:51.454716 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 19:50:51 crc kubenswrapper[4553]: I0930 19:50:51.521706 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 19:50:51 crc kubenswrapper[4553]: I0930 19:50:51.739614 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 19:50:51 crc kubenswrapper[4553]: I0930 19:50:51.756520 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 19:50:54 crc kubenswrapper[4553]: I0930 19:50:54.533107 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:54 crc kubenswrapper[4553]: I0930 19:50:54.533807 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="ceilometer-central-agent" containerID="cri-o://ab1a5ac675df971f36bece18ca185ad186b788f58ed257c02be4797583517d01" gracePeriod=30 Sep 30 19:50:54 crc kubenswrapper[4553]: I0930 19:50:54.533887 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="proxy-httpd" 
containerID="cri-o://00ee1f33b4c828339a4de7443e2130e156cb0ef654faf400a67a9a5a20a393fe" gracePeriod=30 Sep 30 19:50:54 crc kubenswrapper[4553]: I0930 19:50:54.533932 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="ceilometer-notification-agent" containerID="cri-o://761da65a56b1a70f926d9aa211683057be02e99ed87dfb3b5ad4d0b7b2d15399" gracePeriod=30 Sep 30 19:50:54 crc kubenswrapper[4553]: I0930 19:50:54.533915 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="sg-core" containerID="cri-o://03fa552d053a24845cfb7f4ec12f3a6073480d03333f46d970c161cd27b80abb" gracePeriod=30 Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.086886 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.206116 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xm47s"] Sep 30 19:50:55 crc kubenswrapper[4553]: E0930 19:50:55.206455 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon-log" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.206470 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon-log" Sep 30 19:50:55 crc kubenswrapper[4553]: E0930 19:50:55.206505 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.206511 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.206662 4553 
memory_manager.go:354] "RemoveStaleState removing state" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.206679 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" containerName="horizon-log" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.207311 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xm47s" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.219023 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xm47s"] Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.274211 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rshk\" (UniqueName: \"kubernetes.io/projected/17921f25-bee1-4e2e-a9e2-50669133664e-kube-api-access-9rshk\") pod \"17921f25-bee1-4e2e-a9e2-50669133664e\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.274577 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-combined-ca-bundle\") pod \"17921f25-bee1-4e2e-a9e2-50669133664e\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.274613 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17921f25-bee1-4e2e-a9e2-50669133664e-scripts\") pod \"17921f25-bee1-4e2e-a9e2-50669133664e\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.274634 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17921f25-bee1-4e2e-a9e2-50669133664e-config-data\") pod 
\"17921f25-bee1-4e2e-a9e2-50669133664e\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.274650 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-horizon-secret-key\") pod \"17921f25-bee1-4e2e-a9e2-50669133664e\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.274679 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17921f25-bee1-4e2e-a9e2-50669133664e-logs\") pod \"17921f25-bee1-4e2e-a9e2-50669133664e\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.274704 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-horizon-tls-certs\") pod \"17921f25-bee1-4e2e-a9e2-50669133664e\" (UID: \"17921f25-bee1-4e2e-a9e2-50669133664e\") " Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.275074 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-976k6\" (UniqueName: \"kubernetes.io/projected/b972cf08-0eee-4970-8825-a313fdddc23a-kube-api-access-976k6\") pod \"nova-api-db-create-xm47s\" (UID: \"b972cf08-0eee-4970-8825-a313fdddc23a\") " pod="openstack/nova-api-db-create-xm47s" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.275521 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17921f25-bee1-4e2e-a9e2-50669133664e-logs" (OuterVolumeSpecName: "logs") pod "17921f25-bee1-4e2e-a9e2-50669133664e" (UID: "17921f25-bee1-4e2e-a9e2-50669133664e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.284236 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "17921f25-bee1-4e2e-a9e2-50669133664e" (UID: "17921f25-bee1-4e2e-a9e2-50669133664e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.284276 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17921f25-bee1-4e2e-a9e2-50669133664e-kube-api-access-9rshk" (OuterVolumeSpecName: "kube-api-access-9rshk") pod "17921f25-bee1-4e2e-a9e2-50669133664e" (UID: "17921f25-bee1-4e2e-a9e2-50669133664e"). InnerVolumeSpecName "kube-api-access-9rshk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.309331 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17921f25-bee1-4e2e-a9e2-50669133664e-config-data" (OuterVolumeSpecName: "config-data") pod "17921f25-bee1-4e2e-a9e2-50669133664e" (UID: "17921f25-bee1-4e2e-a9e2-50669133664e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.325192 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17921f25-bee1-4e2e-a9e2-50669133664e" (UID: "17921f25-bee1-4e2e-a9e2-50669133664e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.325639 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17921f25-bee1-4e2e-a9e2-50669133664e-scripts" (OuterVolumeSpecName: "scripts") pod "17921f25-bee1-4e2e-a9e2-50669133664e" (UID: "17921f25-bee1-4e2e-a9e2-50669133664e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.343546 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "17921f25-bee1-4e2e-a9e2-50669133664e" (UID: "17921f25-bee1-4e2e-a9e2-50669133664e"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.367269 4553 generic.go:334] "Generic (PLEG): container finished" podID="e89116ff-9090-4e93-b28b-e2782c16f192" containerID="00ee1f33b4c828339a4de7443e2130e156cb0ef654faf400a67a9a5a20a393fe" exitCode=0 Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.367302 4553 generic.go:334] "Generic (PLEG): container finished" podID="e89116ff-9090-4e93-b28b-e2782c16f192" containerID="03fa552d053a24845cfb7f4ec12f3a6073480d03333f46d970c161cd27b80abb" exitCode=2 Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.367309 4553 generic.go:334] "Generic (PLEG): container finished" podID="e89116ff-9090-4e93-b28b-e2782c16f192" containerID="761da65a56b1a70f926d9aa211683057be02e99ed87dfb3b5ad4d0b7b2d15399" exitCode=0 Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.367352 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89116ff-9090-4e93-b28b-e2782c16f192","Type":"ContainerDied","Data":"00ee1f33b4c828339a4de7443e2130e156cb0ef654faf400a67a9a5a20a393fe"} Sep 30 19:50:55 
crc kubenswrapper[4553]: I0930 19:50:55.367376 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89116ff-9090-4e93-b28b-e2782c16f192","Type":"ContainerDied","Data":"03fa552d053a24845cfb7f4ec12f3a6073480d03333f46d970c161cd27b80abb"} Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.367386 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89116ff-9090-4e93-b28b-e2782c16f192","Type":"ContainerDied","Data":"761da65a56b1a70f926d9aa211683057be02e99ed87dfb3b5ad4d0b7b2d15399"} Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.369542 4553 generic.go:334] "Generic (PLEG): container finished" podID="17921f25-bee1-4e2e-a9e2-50669133664e" containerID="433775455daced9402500b2f928308e29c64c51fa046fc1f0a6989a136987f2d" exitCode=137 Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.369572 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c849768b-8k9mh" event={"ID":"17921f25-bee1-4e2e-a9e2-50669133664e","Type":"ContainerDied","Data":"433775455daced9402500b2f928308e29c64c51fa046fc1f0a6989a136987f2d"} Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.369583 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84c849768b-8k9mh" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.369606 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c849768b-8k9mh" event={"ID":"17921f25-bee1-4e2e-a9e2-50669133664e","Type":"ContainerDied","Data":"8ed9ddd1b071890d8c64603020476049c182c88d421c1634793e232667dd10a9"} Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.369629 4553 scope.go:117] "RemoveContainer" containerID="865daf527791fb42a7e38b3ccc019bcf19e002bf322605476e21aceb0aab4be7" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.376140 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-976k6\" (UniqueName: \"kubernetes.io/projected/b972cf08-0eee-4970-8825-a313fdddc23a-kube-api-access-976k6\") pod \"nova-api-db-create-xm47s\" (UID: \"b972cf08-0eee-4970-8825-a313fdddc23a\") " pod="openstack/nova-api-db-create-xm47s" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.376280 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rshk\" (UniqueName: \"kubernetes.io/projected/17921f25-bee1-4e2e-a9e2-50669133664e-kube-api-access-9rshk\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.376302 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.376314 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17921f25-bee1-4e2e-a9e2-50669133664e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.376326 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17921f25-bee1-4e2e-a9e2-50669133664e-config-data\") on node 
\"crc\" DevicePath \"\"" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.376336 4553 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.376347 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17921f25-bee1-4e2e-a9e2-50669133664e-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.376358 4553 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/17921f25-bee1-4e2e-a9e2-50669133664e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.404965 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-976k6\" (UniqueName: \"kubernetes.io/projected/b972cf08-0eee-4970-8825-a313fdddc23a-kube-api-access-976k6\") pod \"nova-api-db-create-xm47s\" (UID: \"b972cf08-0eee-4970-8825-a313fdddc23a\") " pod="openstack/nova-api-db-create-xm47s" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.412778 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-f7n8l"] Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.414457 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f7n8l" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.433634 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f7n8l"] Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.474560 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84c849768b-8k9mh"] Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.489549 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-84c849768b-8k9mh"] Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.528724 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17921f25-bee1-4e2e-a9e2-50669133664e" path="/var/lib/kubelet/pods/17921f25-bee1-4e2e-a9e2-50669133664e/volumes" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.534519 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-5p4j4"] Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.537014 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5p4j4" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.541713 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5p4j4"] Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.579180 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf7nw\" (UniqueName: \"kubernetes.io/projected/78299d5b-ca49-4bfa-a23e-c81671ab07da-kube-api-access-qf7nw\") pod \"nova-cell0-db-create-f7n8l\" (UID: \"78299d5b-ca49-4bfa-a23e-c81671ab07da\") " pod="openstack/nova-cell0-db-create-f7n8l" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.589702 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xm47s" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.629139 4553 scope.go:117] "RemoveContainer" containerID="433775455daced9402500b2f928308e29c64c51fa046fc1f0a6989a136987f2d" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.660900 4553 scope.go:117] "RemoveContainer" containerID="865daf527791fb42a7e38b3ccc019bcf19e002bf322605476e21aceb0aab4be7" Sep 30 19:50:55 crc kubenswrapper[4553]: E0930 19:50:55.661530 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865daf527791fb42a7e38b3ccc019bcf19e002bf322605476e21aceb0aab4be7\": container with ID starting with 865daf527791fb42a7e38b3ccc019bcf19e002bf322605476e21aceb0aab4be7 not found: ID does not exist" containerID="865daf527791fb42a7e38b3ccc019bcf19e002bf322605476e21aceb0aab4be7" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.661650 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865daf527791fb42a7e38b3ccc019bcf19e002bf322605476e21aceb0aab4be7"} err="failed to get container status \"865daf527791fb42a7e38b3ccc019bcf19e002bf322605476e21aceb0aab4be7\": rpc error: code = NotFound desc = could not find container \"865daf527791fb42a7e38b3ccc019bcf19e002bf322605476e21aceb0aab4be7\": container with ID starting with 865daf527791fb42a7e38b3ccc019bcf19e002bf322605476e21aceb0aab4be7 not found: ID does not exist" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.661749 4553 scope.go:117] "RemoveContainer" containerID="433775455daced9402500b2f928308e29c64c51fa046fc1f0a6989a136987f2d" Sep 30 19:50:55 crc kubenswrapper[4553]: E0930 19:50:55.662182 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"433775455daced9402500b2f928308e29c64c51fa046fc1f0a6989a136987f2d\": container with ID starting with 
433775455daced9402500b2f928308e29c64c51fa046fc1f0a6989a136987f2d not found: ID does not exist" containerID="433775455daced9402500b2f928308e29c64c51fa046fc1f0a6989a136987f2d" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.662291 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"433775455daced9402500b2f928308e29c64c51fa046fc1f0a6989a136987f2d"} err="failed to get container status \"433775455daced9402500b2f928308e29c64c51fa046fc1f0a6989a136987f2d\": rpc error: code = NotFound desc = could not find container \"433775455daced9402500b2f928308e29c64c51fa046fc1f0a6989a136987f2d\": container with ID starting with 433775455daced9402500b2f928308e29c64c51fa046fc1f0a6989a136987f2d not found: ID does not exist" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.681552 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mjvx\" (UniqueName: \"kubernetes.io/projected/b60eae4e-80e4-4f1d-b7d9-7b498649fa67-kube-api-access-7mjvx\") pod \"nova-cell1-db-create-5p4j4\" (UID: \"b60eae4e-80e4-4f1d-b7d9-7b498649fa67\") " pod="openstack/nova-cell1-db-create-5p4j4" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.681838 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf7nw\" (UniqueName: \"kubernetes.io/projected/78299d5b-ca49-4bfa-a23e-c81671ab07da-kube-api-access-qf7nw\") pod \"nova-cell0-db-create-f7n8l\" (UID: \"78299d5b-ca49-4bfa-a23e-c81671ab07da\") " pod="openstack/nova-cell0-db-create-f7n8l" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.698012 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf7nw\" (UniqueName: \"kubernetes.io/projected/78299d5b-ca49-4bfa-a23e-c81671ab07da-kube-api-access-qf7nw\") pod \"nova-cell0-db-create-f7n8l\" (UID: \"78299d5b-ca49-4bfa-a23e-c81671ab07da\") " pod="openstack/nova-cell0-db-create-f7n8l" Sep 30 19:50:55 crc 
kubenswrapper[4553]: I0930 19:50:55.734593 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f7n8l" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.787649 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mjvx\" (UniqueName: \"kubernetes.io/projected/b60eae4e-80e4-4f1d-b7d9-7b498649fa67-kube-api-access-7mjvx\") pod \"nova-cell1-db-create-5p4j4\" (UID: \"b60eae4e-80e4-4f1d-b7d9-7b498649fa67\") " pod="openstack/nova-cell1-db-create-5p4j4" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.834600 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mjvx\" (UniqueName: \"kubernetes.io/projected/b60eae4e-80e4-4f1d-b7d9-7b498649fa67-kube-api-access-7mjvx\") pod \"nova-cell1-db-create-5p4j4\" (UID: \"b60eae4e-80e4-4f1d-b7d9-7b498649fa67\") " pod="openstack/nova-cell1-db-create-5p4j4" Sep 30 19:50:55 crc kubenswrapper[4553]: I0930 19:50:55.855513 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5p4j4" Sep 30 19:50:56 crc kubenswrapper[4553]: I0930 19:50:56.204631 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xm47s"] Sep 30 19:50:56 crc kubenswrapper[4553]: I0930 19:50:56.339071 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f7n8l"] Sep 30 19:50:56 crc kubenswrapper[4553]: W0930 19:50:56.342472 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78299d5b_ca49_4bfa_a23e_c81671ab07da.slice/crio-5b3828fdf7e98f620fbdde306d87a3aead345ebb72c9f244cc6e689e644950a8 WatchSource:0}: Error finding container 5b3828fdf7e98f620fbdde306d87a3aead345ebb72c9f244cc6e689e644950a8: Status 404 returned error can't find the container with id 5b3828fdf7e98f620fbdde306d87a3aead345ebb72c9f244cc6e689e644950a8 Sep 30 19:50:56 crc kubenswrapper[4553]: I0930 19:50:56.377818 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xm47s" event={"ID":"b972cf08-0eee-4970-8825-a313fdddc23a","Type":"ContainerStarted","Data":"aaa6ab095536443060a288044b28655f7289370b287e4c7b51bfa669d86c1b39"} Sep 30 19:50:56 crc kubenswrapper[4553]: I0930 19:50:56.379392 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f7n8l" event={"ID":"78299d5b-ca49-4bfa-a23e-c81671ab07da","Type":"ContainerStarted","Data":"5b3828fdf7e98f620fbdde306d87a3aead345ebb72c9f244cc6e689e644950a8"} Sep 30 19:50:56 crc kubenswrapper[4553]: W0930 19:50:56.473568 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb60eae4e_80e4_4f1d_b7d9_7b498649fa67.slice/crio-7205c7bc8ccb0ab0bc628dd79e8fef37713eeec98ffe2d3d19e785e342dc9242 WatchSource:0}: Error finding container 7205c7bc8ccb0ab0bc628dd79e8fef37713eeec98ffe2d3d19e785e342dc9242: Status 404 returned error 
can't find the container with id 7205c7bc8ccb0ab0bc628dd79e8fef37713eeec98ffe2d3d19e785e342dc9242 Sep 30 19:50:56 crc kubenswrapper[4553]: I0930 19:50:56.473842 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5p4j4"] Sep 30 19:50:57 crc kubenswrapper[4553]: I0930 19:50:57.389585 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xm47s" event={"ID":"b972cf08-0eee-4970-8825-a313fdddc23a","Type":"ContainerStarted","Data":"3433188d2a9a256515638e570b187059c1eacbe255285eb22922b90d5174393e"} Sep 30 19:50:57 crc kubenswrapper[4553]: I0930 19:50:57.391021 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f7n8l" event={"ID":"78299d5b-ca49-4bfa-a23e-c81671ab07da","Type":"ContainerStarted","Data":"d5003f330f3fbfc37766eced62d5c0613de075a76797714d11a43688cc571b13"} Sep 30 19:50:57 crc kubenswrapper[4553]: I0930 19:50:57.392299 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5p4j4" event={"ID":"b60eae4e-80e4-4f1d-b7d9-7b498649fa67","Type":"ContainerStarted","Data":"efcf634297d98cd1bd7e45f43d5d5992427b14be27599aabae44a40452c38274"} Sep 30 19:50:57 crc kubenswrapper[4553]: I0930 19:50:57.392329 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5p4j4" event={"ID":"b60eae4e-80e4-4f1d-b7d9-7b498649fa67","Type":"ContainerStarted","Data":"7205c7bc8ccb0ab0bc628dd79e8fef37713eeec98ffe2d3d19e785e342dc9242"} Sep 30 19:50:57 crc kubenswrapper[4553]: I0930 19:50:57.406950 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-xm47s" podStartSLOduration=2.406932032 podStartE2EDuration="2.406932032s" podCreationTimestamp="2025-09-30 19:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:50:57.401560418 +0000 UTC m=+1110.601062548" 
watchObservedRunningTime="2025-09-30 19:50:57.406932032 +0000 UTC m=+1110.606434162" Sep 30 19:50:57 crc kubenswrapper[4553]: I0930 19:50:57.422838 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-f7n8l" podStartSLOduration=2.422821678 podStartE2EDuration="2.422821678s" podCreationTimestamp="2025-09-30 19:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:50:57.416341624 +0000 UTC m=+1110.615843754" watchObservedRunningTime="2025-09-30 19:50:57.422821678 +0000 UTC m=+1110.622323808" Sep 30 19:50:57 crc kubenswrapper[4553]: I0930 19:50:57.436137 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-5p4j4" podStartSLOduration=2.436117306 podStartE2EDuration="2.436117306s" podCreationTimestamp="2025-09-30 19:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:50:57.431513771 +0000 UTC m=+1110.631015901" watchObservedRunningTime="2025-09-30 19:50:57.436117306 +0000 UTC m=+1110.635619436" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.412495 4553 generic.go:334] "Generic (PLEG): container finished" podID="78299d5b-ca49-4bfa-a23e-c81671ab07da" containerID="d5003f330f3fbfc37766eced62d5c0613de075a76797714d11a43688cc571b13" exitCode=0 Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.413899 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f7n8l" event={"ID":"78299d5b-ca49-4bfa-a23e-c81671ab07da","Type":"ContainerDied","Data":"d5003f330f3fbfc37766eced62d5c0613de075a76797714d11a43688cc571b13"} Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.416685 4553 generic.go:334] "Generic (PLEG): container finished" podID="e89116ff-9090-4e93-b28b-e2782c16f192" 
containerID="ab1a5ac675df971f36bece18ca185ad186b788f58ed257c02be4797583517d01" exitCode=0 Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.416784 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89116ff-9090-4e93-b28b-e2782c16f192","Type":"ContainerDied","Data":"ab1a5ac675df971f36bece18ca185ad186b788f58ed257c02be4797583517d01"} Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.418217 4553 generic.go:334] "Generic (PLEG): container finished" podID="b60eae4e-80e4-4f1d-b7d9-7b498649fa67" containerID="efcf634297d98cd1bd7e45f43d5d5992427b14be27599aabae44a40452c38274" exitCode=0 Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.418330 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5p4j4" event={"ID":"b60eae4e-80e4-4f1d-b7d9-7b498649fa67","Type":"ContainerDied","Data":"efcf634297d98cd1bd7e45f43d5d5992427b14be27599aabae44a40452c38274"} Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.420200 4553 generic.go:334] "Generic (PLEG): container finished" podID="b972cf08-0eee-4970-8825-a313fdddc23a" containerID="3433188d2a9a256515638e570b187059c1eacbe255285eb22922b90d5174393e" exitCode=0 Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.420241 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xm47s" event={"ID":"b972cf08-0eee-4970-8825-a313fdddc23a","Type":"ContainerDied","Data":"3433188d2a9a256515638e570b187059c1eacbe255285eb22922b90d5174393e"} Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.547136 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.551498 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89116ff-9090-4e93-b28b-e2782c16f192-log-httpd\") pod \"e89116ff-9090-4e93-b28b-e2782c16f192\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.551532 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-scripts\") pod \"e89116ff-9090-4e93-b28b-e2782c16f192\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.551568 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-combined-ca-bundle\") pod \"e89116ff-9090-4e93-b28b-e2782c16f192\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.551589 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67n9f\" (UniqueName: \"kubernetes.io/projected/e89116ff-9090-4e93-b28b-e2782c16f192-kube-api-access-67n9f\") pod \"e89116ff-9090-4e93-b28b-e2782c16f192\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.551632 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-config-data\") pod \"e89116ff-9090-4e93-b28b-e2782c16f192\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.551660 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e89116ff-9090-4e93-b28b-e2782c16f192-run-httpd\") pod \"e89116ff-9090-4e93-b28b-e2782c16f192\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.551710 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-sg-core-conf-yaml\") pod \"e89116ff-9090-4e93-b28b-e2782c16f192\" (UID: \"e89116ff-9090-4e93-b28b-e2782c16f192\") " Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.553221 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89116ff-9090-4e93-b28b-e2782c16f192-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e89116ff-9090-4e93-b28b-e2782c16f192" (UID: "e89116ff-9090-4e93-b28b-e2782c16f192"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.553854 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89116ff-9090-4e93-b28b-e2782c16f192-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e89116ff-9090-4e93-b28b-e2782c16f192" (UID: "e89116ff-9090-4e93-b28b-e2782c16f192"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.567015 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-scripts" (OuterVolumeSpecName: "scripts") pod "e89116ff-9090-4e93-b28b-e2782c16f192" (UID: "e89116ff-9090-4e93-b28b-e2782c16f192"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.567167 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89116ff-9090-4e93-b28b-e2782c16f192-kube-api-access-67n9f" (OuterVolumeSpecName: "kube-api-access-67n9f") pod "e89116ff-9090-4e93-b28b-e2782c16f192" (UID: "e89116ff-9090-4e93-b28b-e2782c16f192"). InnerVolumeSpecName "kube-api-access-67n9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.620277 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e89116ff-9090-4e93-b28b-e2782c16f192" (UID: "e89116ff-9090-4e93-b28b-e2782c16f192"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.653430 4553 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89116ff-9090-4e93-b28b-e2782c16f192-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.653772 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.653782 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67n9f\" (UniqueName: \"kubernetes.io/projected/e89116ff-9090-4e93-b28b-e2782c16f192-kube-api-access-67n9f\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.653796 4553 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89116ff-9090-4e93-b28b-e2782c16f192-run-httpd\") on node \"crc\" 
DevicePath \"\"" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.653804 4553 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.676271 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e89116ff-9090-4e93-b28b-e2782c16f192" (UID: "e89116ff-9090-4e93-b28b-e2782c16f192"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.703241 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-config-data" (OuterVolumeSpecName: "config-data") pod "e89116ff-9090-4e93-b28b-e2782c16f192" (UID: "e89116ff-9090-4e93-b28b-e2782c16f192"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.764702 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:58 crc kubenswrapper[4553]: I0930 19:50:58.764745 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89116ff-9090-4e93-b28b-e2782c16f192-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.429940 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89116ff-9090-4e93-b28b-e2782c16f192","Type":"ContainerDied","Data":"71883a66518f9895fe58df3c4da4dcd8cf958cd7f0ed6c820a15d7c5f98a7e21"} Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.430030 4553 scope.go:117] "RemoveContainer" containerID="00ee1f33b4c828339a4de7443e2130e156cb0ef654faf400a67a9a5a20a393fe" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.430675 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.471928 4553 scope.go:117] "RemoveContainer" containerID="03fa552d053a24845cfb7f4ec12f3a6073480d03333f46d970c161cd27b80abb" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.474660 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.486518 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.551361 4553 scope.go:117] "RemoveContainer" containerID="761da65a56b1a70f926d9aa211683057be02e99ed87dfb3b5ad4d0b7b2d15399" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.552944 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" path="/var/lib/kubelet/pods/e89116ff-9090-4e93-b28b-e2782c16f192/volumes" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.553724 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:59 crc kubenswrapper[4553]: E0930 19:50:59.553996 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="proxy-httpd" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.554011 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="proxy-httpd" Sep 30 19:50:59 crc kubenswrapper[4553]: E0930 19:50:59.554029 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="ceilometer-notification-agent" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.554048 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="ceilometer-notification-agent" Sep 30 19:50:59 crc kubenswrapper[4553]: E0930 19:50:59.554061 4553 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="sg-core" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.554067 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="sg-core" Sep 30 19:50:59 crc kubenswrapper[4553]: E0930 19:50:59.554079 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="ceilometer-central-agent" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.554085 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="ceilometer-central-agent" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.554255 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="ceilometer-central-agent" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.554274 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="proxy-httpd" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.554286 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="sg-core" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.554299 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89116ff-9090-4e93-b28b-e2782c16f192" containerName="ceilometer-notification-agent" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.556898 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.556983 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.559668 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.559858 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.577630 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.577879 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bffdc103-73a5-426d-af27-cb6efe0c9603-run-httpd\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.577927 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcz4k\" (UniqueName: \"kubernetes.io/projected/bffdc103-73a5-426d-af27-cb6efe0c9603-kube-api-access-dcz4k\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.577972 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-scripts\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.577999 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bffdc103-73a5-426d-af27-cb6efe0c9603-log-httpd\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.578153 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.578178 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-config-data\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.585222 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.585284 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.586894 4553 scope.go:117] "RemoveContainer" containerID="ab1a5ac675df971f36bece18ca185ad186b788f58ed257c02be4797583517d01" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 
19:50:59.680064 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-scripts\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.680120 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bffdc103-73a5-426d-af27-cb6efe0c9603-log-httpd\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.680171 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.680197 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-config-data\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.680226 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.680261 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bffdc103-73a5-426d-af27-cb6efe0c9603-run-httpd\") pod \"ceilometer-0\" (UID: 
\"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.680307 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcz4k\" (UniqueName: \"kubernetes.io/projected/bffdc103-73a5-426d-af27-cb6efe0c9603-kube-api-access-dcz4k\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.684163 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bffdc103-73a5-426d-af27-cb6efe0c9603-log-httpd\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.694263 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.697354 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bffdc103-73a5-426d-af27-cb6efe0c9603-run-httpd\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.698082 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-scripts\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.701372 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.709604 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcz4k\" (UniqueName: \"kubernetes.io/projected/bffdc103-73a5-426d-af27-cb6efe0c9603-kube-api-access-dcz4k\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.730255 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-config-data\") pod \"ceilometer-0\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.883101 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.922802 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5p4j4" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.965917 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f7n8l" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.987951 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf7nw\" (UniqueName: \"kubernetes.io/projected/78299d5b-ca49-4bfa-a23e-c81671ab07da-kube-api-access-qf7nw\") pod \"78299d5b-ca49-4bfa-a23e-c81671ab07da\" (UID: \"78299d5b-ca49-4bfa-a23e-c81671ab07da\") " Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.988033 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mjvx\" (UniqueName: \"kubernetes.io/projected/b60eae4e-80e4-4f1d-b7d9-7b498649fa67-kube-api-access-7mjvx\") pod \"b60eae4e-80e4-4f1d-b7d9-7b498649fa67\" (UID: \"b60eae4e-80e4-4f1d-b7d9-7b498649fa67\") " Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.992159 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60eae4e-80e4-4f1d-b7d9-7b498649fa67-kube-api-access-7mjvx" (OuterVolumeSpecName: "kube-api-access-7mjvx") pod "b60eae4e-80e4-4f1d-b7d9-7b498649fa67" (UID: "b60eae4e-80e4-4f1d-b7d9-7b498649fa67"). InnerVolumeSpecName "kube-api-access-7mjvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:50:59 crc kubenswrapper[4553]: I0930 19:50:59.992894 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78299d5b-ca49-4bfa-a23e-c81671ab07da-kube-api-access-qf7nw" (OuterVolumeSpecName: "kube-api-access-qf7nw") pod "78299d5b-ca49-4bfa-a23e-c81671ab07da" (UID: "78299d5b-ca49-4bfa-a23e-c81671ab07da"). InnerVolumeSpecName "kube-api-access-qf7nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.000346 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xm47s" Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.088777 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-976k6\" (UniqueName: \"kubernetes.io/projected/b972cf08-0eee-4970-8825-a313fdddc23a-kube-api-access-976k6\") pod \"b972cf08-0eee-4970-8825-a313fdddc23a\" (UID: \"b972cf08-0eee-4970-8825-a313fdddc23a\") " Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.089120 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf7nw\" (UniqueName: \"kubernetes.io/projected/78299d5b-ca49-4bfa-a23e-c81671ab07da-kube-api-access-qf7nw\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.089138 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mjvx\" (UniqueName: \"kubernetes.io/projected/b60eae4e-80e4-4f1d-b7d9-7b498649fa67-kube-api-access-7mjvx\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.095153 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b972cf08-0eee-4970-8825-a313fdddc23a-kube-api-access-976k6" (OuterVolumeSpecName: "kube-api-access-976k6") pod "b972cf08-0eee-4970-8825-a313fdddc23a" (UID: "b972cf08-0eee-4970-8825-a313fdddc23a"). InnerVolumeSpecName "kube-api-access-976k6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.191237 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-976k6\" (UniqueName: \"kubernetes.io/projected/b972cf08-0eee-4970-8825-a313fdddc23a-kube-api-access-976k6\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.387085 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.440335 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bffdc103-73a5-426d-af27-cb6efe0c9603","Type":"ContainerStarted","Data":"051c2f2a3a3cab43f202020cefdcd7adc5a73a9dbc3dab2ce966bd1e13eaa840"} Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.442297 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xm47s" Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.442697 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xm47s" event={"ID":"b972cf08-0eee-4970-8825-a313fdddc23a","Type":"ContainerDied","Data":"aaa6ab095536443060a288044b28655f7289370b287e4c7b51bfa669d86c1b39"} Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.442724 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaa6ab095536443060a288044b28655f7289370b287e4c7b51bfa669d86c1b39" Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.445116 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f7n8l" Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.445143 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f7n8l" event={"ID":"78299d5b-ca49-4bfa-a23e-c81671ab07da","Type":"ContainerDied","Data":"5b3828fdf7e98f620fbdde306d87a3aead345ebb72c9f244cc6e689e644950a8"} Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.445185 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b3828fdf7e98f620fbdde306d87a3aead345ebb72c9f244cc6e689e644950a8" Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.448104 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5p4j4" event={"ID":"b60eae4e-80e4-4f1d-b7d9-7b498649fa67","Type":"ContainerDied","Data":"7205c7bc8ccb0ab0bc628dd79e8fef37713eeec98ffe2d3d19e785e342dc9242"} Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.448139 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7205c7bc8ccb0ab0bc628dd79e8fef37713eeec98ffe2d3d19e785e342dc9242" Sep 30 19:51:00 crc kubenswrapper[4553]: I0930 19:51:00.448196 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5p4j4" Sep 30 19:51:01 crc kubenswrapper[4553]: I0930 19:51:01.456776 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bffdc103-73a5-426d-af27-cb6efe0c9603","Type":"ContainerStarted","Data":"75c7105a4c59d9a39223603003be2e92e264d5c999dba8759e28681ddb2a778b"} Sep 30 19:51:02 crc kubenswrapper[4553]: I0930 19:51:02.466049 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bffdc103-73a5-426d-af27-cb6efe0c9603","Type":"ContainerStarted","Data":"5548e03e6a26d21dd3c4ce6f61aca55919e298cdb5a0f45ee19f9ba5dc63d704"} Sep 30 19:51:02 crc kubenswrapper[4553]: I0930 19:51:02.466288 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bffdc103-73a5-426d-af27-cb6efe0c9603","Type":"ContainerStarted","Data":"82ac53f043f89a5cb7cde4a325fde041898ca52468f9f3ad06ecb0fdc764062a"} Sep 30 19:51:04 crc kubenswrapper[4553]: I0930 19:51:04.483113 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bffdc103-73a5-426d-af27-cb6efe0c9603","Type":"ContainerStarted","Data":"20d524eecf8e47a5dfc74efd3b4282d465ff7cc37368f4ce1d4d20c1f33ec49d"} Sep 30 19:51:04 crc kubenswrapper[4553]: I0930 19:51:04.483589 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 19:51:04 crc kubenswrapper[4553]: I0930 19:51:04.506934 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.830529063 podStartE2EDuration="5.506916016s" podCreationTimestamp="2025-09-30 19:50:59 +0000 UTC" firstStartedPulling="2025-09-30 19:51:00.398616482 +0000 UTC m=+1113.598118612" lastFinishedPulling="2025-09-30 19:51:04.075003425 +0000 UTC m=+1117.274505565" observedRunningTime="2025-09-30 19:51:04.503931405 +0000 UTC m=+1117.703433545" watchObservedRunningTime="2025-09-30 
19:51:04.506916016 +0000 UTC m=+1117.706418156" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.461267 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6d09-account-create-rf9p6"] Sep 30 19:51:05 crc kubenswrapper[4553]: E0930 19:51:05.461667 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b972cf08-0eee-4970-8825-a313fdddc23a" containerName="mariadb-database-create" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.461688 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="b972cf08-0eee-4970-8825-a313fdddc23a" containerName="mariadb-database-create" Sep 30 19:51:05 crc kubenswrapper[4553]: E0930 19:51:05.461705 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78299d5b-ca49-4bfa-a23e-c81671ab07da" containerName="mariadb-database-create" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.461714 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="78299d5b-ca49-4bfa-a23e-c81671ab07da" containerName="mariadb-database-create" Sep 30 19:51:05 crc kubenswrapper[4553]: E0930 19:51:05.461749 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60eae4e-80e4-4f1d-b7d9-7b498649fa67" containerName="mariadb-database-create" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.461757 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60eae4e-80e4-4f1d-b7d9-7b498649fa67" containerName="mariadb-database-create" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.461954 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="b972cf08-0eee-4970-8825-a313fdddc23a" containerName="mariadb-database-create" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.461983 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="b60eae4e-80e4-4f1d-b7d9-7b498649fa67" containerName="mariadb-database-create" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.462000 4553 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="78299d5b-ca49-4bfa-a23e-c81671ab07da" containerName="mariadb-database-create" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.462672 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6d09-account-create-rf9p6" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.465788 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.475072 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6d09-account-create-rf9p6"] Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.482983 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcnww\" (UniqueName: \"kubernetes.io/projected/31b3010c-e679-4828-b02a-7c89c82d6f17-kube-api-access-kcnww\") pod \"nova-api-6d09-account-create-rf9p6\" (UID: \"31b3010c-e679-4828-b02a-7c89c82d6f17\") " pod="openstack/nova-api-6d09-account-create-rf9p6" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.595251 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcnww\" (UniqueName: \"kubernetes.io/projected/31b3010c-e679-4828-b02a-7c89c82d6f17-kube-api-access-kcnww\") pod \"nova-api-6d09-account-create-rf9p6\" (UID: \"31b3010c-e679-4828-b02a-7c89c82d6f17\") " pod="openstack/nova-api-6d09-account-create-rf9p6" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.644012 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcnww\" (UniqueName: \"kubernetes.io/projected/31b3010c-e679-4828-b02a-7c89c82d6f17-kube-api-access-kcnww\") pod \"nova-api-6d09-account-create-rf9p6\" (UID: \"31b3010c-e679-4828-b02a-7c89c82d6f17\") " pod="openstack/nova-api-6d09-account-create-rf9p6" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.654072 4553 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-8fd0-account-create-4j249"] Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.655272 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8fd0-account-create-4j249" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.658548 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.665273 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8fd0-account-create-4j249"] Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.799177 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcn88\" (UniqueName: \"kubernetes.io/projected/230810df-34fe-4a09-bf1a-ab53ba9faef4-kube-api-access-vcn88\") pod \"nova-cell0-8fd0-account-create-4j249\" (UID: \"230810df-34fe-4a09-bf1a-ab53ba9faef4\") " pod="openstack/nova-cell0-8fd0-account-create-4j249" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.812248 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6d09-account-create-rf9p6" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.900733 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcn88\" (UniqueName: \"kubernetes.io/projected/230810df-34fe-4a09-bf1a-ab53ba9faef4-kube-api-access-vcn88\") pod \"nova-cell0-8fd0-account-create-4j249\" (UID: \"230810df-34fe-4a09-bf1a-ab53ba9faef4\") " pod="openstack/nova-cell0-8fd0-account-create-4j249" Sep 30 19:51:05 crc kubenswrapper[4553]: I0930 19:51:05.918015 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcn88\" (UniqueName: \"kubernetes.io/projected/230810df-34fe-4a09-bf1a-ab53ba9faef4-kube-api-access-vcn88\") pod \"nova-cell0-8fd0-account-create-4j249\" (UID: \"230810df-34fe-4a09-bf1a-ab53ba9faef4\") " pod="openstack/nova-cell0-8fd0-account-create-4j249" Sep 30 19:51:06 crc kubenswrapper[4553]: I0930 19:51:06.002241 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8fd0-account-create-4j249" Sep 30 19:51:06 crc kubenswrapper[4553]: I0930 19:51:06.320234 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6d09-account-create-rf9p6"] Sep 30 19:51:06 crc kubenswrapper[4553]: I0930 19:51:06.501001 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8fd0-account-create-4j249"] Sep 30 19:51:06 crc kubenswrapper[4553]: I0930 19:51:06.525547 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6d09-account-create-rf9p6" event={"ID":"31b3010c-e679-4828-b02a-7c89c82d6f17","Type":"ContainerStarted","Data":"2eaa13bdb31054217b111abe7f3cb902fbbcf3b9155470335d81ac7cf01a1085"} Sep 30 19:51:06 crc kubenswrapper[4553]: I0930 19:51:06.525606 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6d09-account-create-rf9p6" event={"ID":"31b3010c-e679-4828-b02a-7c89c82d6f17","Type":"ContainerStarted","Data":"ded42ec5da1ce527d083b4d149c252583674aac8a222b635246fe82a88b59e4c"} Sep 30 19:51:06 crc kubenswrapper[4553]: I0930 19:51:06.542128 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-6d09-account-create-rf9p6" podStartSLOduration=1.542102638 podStartE2EDuration="1.542102638s" podCreationTimestamp="2025-09-30 19:51:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:51:06.538999065 +0000 UTC m=+1119.738501235" watchObservedRunningTime="2025-09-30 19:51:06.542102638 +0000 UTC m=+1119.741604788" Sep 30 19:51:07 crc kubenswrapper[4553]: I0930 19:51:07.542969 4553 generic.go:334] "Generic (PLEG): container finished" podID="230810df-34fe-4a09-bf1a-ab53ba9faef4" containerID="257cbe488113dfd3a6e4f366e53b530de6687a52dd4eb7065283437bfc71f0a9" exitCode=0 Sep 30 19:51:07 crc kubenswrapper[4553]: I0930 19:51:07.543527 4553 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell0-8fd0-account-create-4j249" event={"ID":"230810df-34fe-4a09-bf1a-ab53ba9faef4","Type":"ContainerDied","Data":"257cbe488113dfd3a6e4f366e53b530de6687a52dd4eb7065283437bfc71f0a9"} Sep 30 19:51:07 crc kubenswrapper[4553]: I0930 19:51:07.543573 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8fd0-account-create-4j249" event={"ID":"230810df-34fe-4a09-bf1a-ab53ba9faef4","Type":"ContainerStarted","Data":"033f67a1aace97d34a2f06094e4196c18b03959314657cb98bef725c0beb6f18"} Sep 30 19:51:07 crc kubenswrapper[4553]: I0930 19:51:07.545992 4553 generic.go:334] "Generic (PLEG): container finished" podID="31b3010c-e679-4828-b02a-7c89c82d6f17" containerID="2eaa13bdb31054217b111abe7f3cb902fbbcf3b9155470335d81ac7cf01a1085" exitCode=0 Sep 30 19:51:07 crc kubenswrapper[4553]: I0930 19:51:07.546068 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6d09-account-create-rf9p6" event={"ID":"31b3010c-e679-4828-b02a-7c89c82d6f17","Type":"ContainerDied","Data":"2eaa13bdb31054217b111abe7f3cb902fbbcf3b9155470335d81ac7cf01a1085"} Sep 30 19:51:08 crc kubenswrapper[4553]: I0930 19:51:08.993644 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8fd0-account-create-4j249" Sep 30 19:51:08 crc kubenswrapper[4553]: I0930 19:51:08.998634 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6d09-account-create-rf9p6" Sep 30 19:51:09 crc kubenswrapper[4553]: I0930 19:51:09.181211 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcnww\" (UniqueName: \"kubernetes.io/projected/31b3010c-e679-4828-b02a-7c89c82d6f17-kube-api-access-kcnww\") pod \"31b3010c-e679-4828-b02a-7c89c82d6f17\" (UID: \"31b3010c-e679-4828-b02a-7c89c82d6f17\") " Sep 30 19:51:09 crc kubenswrapper[4553]: I0930 19:51:09.181318 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcn88\" (UniqueName: \"kubernetes.io/projected/230810df-34fe-4a09-bf1a-ab53ba9faef4-kube-api-access-vcn88\") pod \"230810df-34fe-4a09-bf1a-ab53ba9faef4\" (UID: \"230810df-34fe-4a09-bf1a-ab53ba9faef4\") " Sep 30 19:51:09 crc kubenswrapper[4553]: I0930 19:51:09.188293 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31b3010c-e679-4828-b02a-7c89c82d6f17-kube-api-access-kcnww" (OuterVolumeSpecName: "kube-api-access-kcnww") pod "31b3010c-e679-4828-b02a-7c89c82d6f17" (UID: "31b3010c-e679-4828-b02a-7c89c82d6f17"). InnerVolumeSpecName "kube-api-access-kcnww". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:51:09 crc kubenswrapper[4553]: I0930 19:51:09.188826 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/230810df-34fe-4a09-bf1a-ab53ba9faef4-kube-api-access-vcn88" (OuterVolumeSpecName: "kube-api-access-vcn88") pod "230810df-34fe-4a09-bf1a-ab53ba9faef4" (UID: "230810df-34fe-4a09-bf1a-ab53ba9faef4"). InnerVolumeSpecName "kube-api-access-vcn88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:51:09 crc kubenswrapper[4553]: I0930 19:51:09.283637 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcnww\" (UniqueName: \"kubernetes.io/projected/31b3010c-e679-4828-b02a-7c89c82d6f17-kube-api-access-kcnww\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:09 crc kubenswrapper[4553]: I0930 19:51:09.283669 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcn88\" (UniqueName: \"kubernetes.io/projected/230810df-34fe-4a09-bf1a-ab53ba9faef4-kube-api-access-vcn88\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:09 crc kubenswrapper[4553]: I0930 19:51:09.568064 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8fd0-account-create-4j249" event={"ID":"230810df-34fe-4a09-bf1a-ab53ba9faef4","Type":"ContainerDied","Data":"033f67a1aace97d34a2f06094e4196c18b03959314657cb98bef725c0beb6f18"} Sep 30 19:51:09 crc kubenswrapper[4553]: I0930 19:51:09.568636 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="033f67a1aace97d34a2f06094e4196c18b03959314657cb98bef725c0beb6f18" Sep 30 19:51:09 crc kubenswrapper[4553]: I0930 19:51:09.568152 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8fd0-account-create-4j249" Sep 30 19:51:09 crc kubenswrapper[4553]: I0930 19:51:09.572178 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6d09-account-create-rf9p6" event={"ID":"31b3010c-e679-4828-b02a-7c89c82d6f17","Type":"ContainerDied","Data":"ded42ec5da1ce527d083b4d149c252583674aac8a222b635246fe82a88b59e4c"} Sep 30 19:51:09 crc kubenswrapper[4553]: I0930 19:51:09.572344 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ded42ec5da1ce527d083b4d149c252583674aac8a222b635246fe82a88b59e4c" Sep 30 19:51:09 crc kubenswrapper[4553]: I0930 19:51:09.572364 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6d09-account-create-rf9p6" Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.921249 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndtgv"] Sep 30 19:51:10 crc kubenswrapper[4553]: E0930 19:51:10.921771 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31b3010c-e679-4828-b02a-7c89c82d6f17" containerName="mariadb-account-create" Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.921786 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="31b3010c-e679-4828-b02a-7c89c82d6f17" containerName="mariadb-account-create" Sep 30 19:51:10 crc kubenswrapper[4553]: E0930 19:51:10.921801 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="230810df-34fe-4a09-bf1a-ab53ba9faef4" containerName="mariadb-account-create" Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.921808 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="230810df-34fe-4a09-bf1a-ab53ba9faef4" containerName="mariadb-account-create" Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.921999 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="230810df-34fe-4a09-bf1a-ab53ba9faef4" 
containerName="mariadb-account-create"
Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.922014 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="31b3010c-e679-4828-b02a-7c89c82d6f17" containerName="mariadb-account-create"
Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.922572 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.927085 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.927246 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.927297 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-s2dpn"
Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.929079 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndtgv"]
Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.935094 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z682\" (UniqueName: \"kubernetes.io/projected/6616a935-12f1-4f60-a206-1dbcfd9a6400-kube-api-access-2z682\") pod \"nova-cell0-conductor-db-sync-ndtgv\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") " pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.935256 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ndtgv\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") " pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.935373 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-config-data\") pod \"nova-cell0-conductor-db-sync-ndtgv\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") " pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:10 crc kubenswrapper[4553]: I0930 19:51:10.935411 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-scripts\") pod \"nova-cell0-conductor-db-sync-ndtgv\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") " pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:11 crc kubenswrapper[4553]: I0930 19:51:11.036905 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z682\" (UniqueName: \"kubernetes.io/projected/6616a935-12f1-4f60-a206-1dbcfd9a6400-kube-api-access-2z682\") pod \"nova-cell0-conductor-db-sync-ndtgv\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") " pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:11 crc kubenswrapper[4553]: I0930 19:51:11.037011 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ndtgv\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") " pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:11 crc kubenswrapper[4553]: I0930 19:51:11.037101 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-config-data\") pod \"nova-cell0-conductor-db-sync-ndtgv\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") " pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:11 crc kubenswrapper[4553]: I0930 19:51:11.037129 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-scripts\") pod \"nova-cell0-conductor-db-sync-ndtgv\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") " pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:11 crc kubenswrapper[4553]: I0930 19:51:11.045574 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-config-data\") pod \"nova-cell0-conductor-db-sync-ndtgv\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") " pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:11 crc kubenswrapper[4553]: I0930 19:51:11.046700 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-scripts\") pod \"nova-cell0-conductor-db-sync-ndtgv\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") " pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:11 crc kubenswrapper[4553]: I0930 19:51:11.048428 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ndtgv\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") " pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:11 crc kubenswrapper[4553]: I0930 19:51:11.055870 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z682\" (UniqueName: \"kubernetes.io/projected/6616a935-12f1-4f60-a206-1dbcfd9a6400-kube-api-access-2z682\") pod \"nova-cell0-conductor-db-sync-ndtgv\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") " pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:11 crc kubenswrapper[4553]: I0930 19:51:11.245549 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:11 crc kubenswrapper[4553]: I0930 19:51:11.738336 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndtgv"]
Sep 30 19:51:12 crc kubenswrapper[4553]: I0930 19:51:12.605897 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ndtgv" event={"ID":"6616a935-12f1-4f60-a206-1dbcfd9a6400","Type":"ContainerStarted","Data":"63f8ab5eb38df5de5968952ed5cf85eb6395ec5991490ba7f844289e91acb5bf"}
Sep 30 19:51:15 crc kubenswrapper[4553]: I0930 19:51:15.689660 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-eee0-account-create-x9tl2"]
Sep 30 19:51:15 crc kubenswrapper[4553]: I0930 19:51:15.691175 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-eee0-account-create-x9tl2"
Sep 30 19:51:15 crc kubenswrapper[4553]: I0930 19:51:15.693438 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Sep 30 19:51:15 crc kubenswrapper[4553]: I0930 19:51:15.701231 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-eee0-account-create-x9tl2"]
Sep 30 19:51:15 crc kubenswrapper[4553]: I0930 19:51:15.834336 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tc95\" (UniqueName: \"kubernetes.io/projected/82733d90-45f9-482e-a453-3b52a14b064e-kube-api-access-6tc95\") pod \"nova-cell1-eee0-account-create-x9tl2\" (UID: \"82733d90-45f9-482e-a453-3b52a14b064e\") " pod="openstack/nova-cell1-eee0-account-create-x9tl2"
Sep 30 19:51:15 crc kubenswrapper[4553]: I0930 19:51:15.937329 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tc95\" (UniqueName: \"kubernetes.io/projected/82733d90-45f9-482e-a453-3b52a14b064e-kube-api-access-6tc95\") pod \"nova-cell1-eee0-account-create-x9tl2\" (UID: \"82733d90-45f9-482e-a453-3b52a14b064e\") " pod="openstack/nova-cell1-eee0-account-create-x9tl2"
Sep 30 19:51:15 crc kubenswrapper[4553]: I0930 19:51:15.970324 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tc95\" (UniqueName: \"kubernetes.io/projected/82733d90-45f9-482e-a453-3b52a14b064e-kube-api-access-6tc95\") pod \"nova-cell1-eee0-account-create-x9tl2\" (UID: \"82733d90-45f9-482e-a453-3b52a14b064e\") " pod="openstack/nova-cell1-eee0-account-create-x9tl2"
Sep 30 19:51:16 crc kubenswrapper[4553]: I0930 19:51:16.013092 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-eee0-account-create-x9tl2"
Sep 30 19:51:19 crc kubenswrapper[4553]: I0930 19:51:19.247957 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-eee0-account-create-x9tl2"]
Sep 30 19:51:19 crc kubenswrapper[4553]: W0930 19:51:19.254479 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82733d90_45f9_482e_a453_3b52a14b064e.slice/crio-af841b7742c0c7155c76c5c6c4365c8f8c229f948c37efc947a128c6709af951 WatchSource:0}: Error finding container af841b7742c0c7155c76c5c6c4365c8f8c229f948c37efc947a128c6709af951: Status 404 returned error can't find the container with id af841b7742c0c7155c76c5c6c4365c8f8c229f948c37efc947a128c6709af951
Sep 30 19:51:19 crc kubenswrapper[4553]: I0930 19:51:19.698656 4553 generic.go:334] "Generic (PLEG): container finished" podID="82733d90-45f9-482e-a453-3b52a14b064e" containerID="2f7199265f206214b408faa606e54592ecbc0106599a615343faf7f12b2e50bd" exitCode=0
Sep 30 19:51:19 crc kubenswrapper[4553]: I0930 19:51:19.698699 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eee0-account-create-x9tl2" event={"ID":"82733d90-45f9-482e-a453-3b52a14b064e","Type":"ContainerDied","Data":"2f7199265f206214b408faa606e54592ecbc0106599a615343faf7f12b2e50bd"}
Sep 30 19:51:19 crc kubenswrapper[4553]: I0930 19:51:19.698736 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eee0-account-create-x9tl2" event={"ID":"82733d90-45f9-482e-a453-3b52a14b064e","Type":"ContainerStarted","Data":"af841b7742c0c7155c76c5c6c4365c8f8c229f948c37efc947a128c6709af951"}
Sep 30 19:51:19 crc kubenswrapper[4553]: I0930 19:51:19.700616 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ndtgv" event={"ID":"6616a935-12f1-4f60-a206-1dbcfd9a6400","Type":"ContainerStarted","Data":"1a75ab9d5ade3578e48f97369ccb68cb1cbe534dab012e5aece7d85808809958"}
Sep 30 19:51:21 crc kubenswrapper[4553]: I0930 19:51:21.051073 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-eee0-account-create-x9tl2"
Sep 30 19:51:21 crc kubenswrapper[4553]: I0930 19:51:21.111312 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ndtgv" podStartSLOduration=4.177297894 podStartE2EDuration="11.111292874s" podCreationTimestamp="2025-09-30 19:51:10 +0000 UTC" firstStartedPulling="2025-09-30 19:51:11.747195263 +0000 UTC m=+1124.946697393" lastFinishedPulling="2025-09-30 19:51:18.681190253 +0000 UTC m=+1131.880692373" observedRunningTime="2025-09-30 19:51:19.739878562 +0000 UTC m=+1132.939380722" watchObservedRunningTime="2025-09-30 19:51:21.111292874 +0000 UTC m=+1134.310795004"
Sep 30 19:51:21 crc kubenswrapper[4553]: I0930 19:51:21.149011 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tc95\" (UniqueName: \"kubernetes.io/projected/82733d90-45f9-482e-a453-3b52a14b064e-kube-api-access-6tc95\") pod \"82733d90-45f9-482e-a453-3b52a14b064e\" (UID: \"82733d90-45f9-482e-a453-3b52a14b064e\") "
Sep 30 19:51:21 crc kubenswrapper[4553]: I0930 19:51:21.154784 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82733d90-45f9-482e-a453-3b52a14b064e-kube-api-access-6tc95" (OuterVolumeSpecName: "kube-api-access-6tc95") pod "82733d90-45f9-482e-a453-3b52a14b064e" (UID: "82733d90-45f9-482e-a453-3b52a14b064e"). InnerVolumeSpecName "kube-api-access-6tc95". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:51:21 crc kubenswrapper[4553]: I0930 19:51:21.251023 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tc95\" (UniqueName: \"kubernetes.io/projected/82733d90-45f9-482e-a453-3b52a14b064e-kube-api-access-6tc95\") on node \"crc\" DevicePath \"\""
Sep 30 19:51:21 crc kubenswrapper[4553]: I0930 19:51:21.725308 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-eee0-account-create-x9tl2" event={"ID":"82733d90-45f9-482e-a453-3b52a14b064e","Type":"ContainerDied","Data":"af841b7742c0c7155c76c5c6c4365c8f8c229f948c37efc947a128c6709af951"}
Sep 30 19:51:21 crc kubenswrapper[4553]: I0930 19:51:21.725351 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af841b7742c0c7155c76c5c6c4365c8f8c229f948c37efc947a128c6709af951"
Sep 30 19:51:21 crc kubenswrapper[4553]: I0930 19:51:21.725354 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-eee0-account-create-x9tl2"
Sep 30 19:51:29 crc kubenswrapper[4553]: I0930 19:51:29.585675 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 19:51:29 crc kubenswrapper[4553]: I0930 19:51:29.586515 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 19:51:29 crc kubenswrapper[4553]: I0930 19:51:29.844627 4553 generic.go:334] "Generic (PLEG): container finished" podID="6616a935-12f1-4f60-a206-1dbcfd9a6400" containerID="1a75ab9d5ade3578e48f97369ccb68cb1cbe534dab012e5aece7d85808809958" exitCode=0
Sep 30 19:51:29 crc kubenswrapper[4553]: I0930 19:51:29.844687 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ndtgv" event={"ID":"6616a935-12f1-4f60-a206-1dbcfd9a6400","Type":"ContainerDied","Data":"1a75ab9d5ade3578e48f97369ccb68cb1cbe534dab012e5aece7d85808809958"}
Sep 30 19:51:29 crc kubenswrapper[4553]: I0930 19:51:29.901789 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.199556 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.388847 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-config-data\") pod \"6616a935-12f1-4f60-a206-1dbcfd9a6400\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") "
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.389018 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-scripts\") pod \"6616a935-12f1-4f60-a206-1dbcfd9a6400\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") "
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.389107 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z682\" (UniqueName: \"kubernetes.io/projected/6616a935-12f1-4f60-a206-1dbcfd9a6400-kube-api-access-2z682\") pod \"6616a935-12f1-4f60-a206-1dbcfd9a6400\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") "
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.389129 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-combined-ca-bundle\") pod \"6616a935-12f1-4f60-a206-1dbcfd9a6400\" (UID: \"6616a935-12f1-4f60-a206-1dbcfd9a6400\") "
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.403246 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6616a935-12f1-4f60-a206-1dbcfd9a6400-kube-api-access-2z682" (OuterVolumeSpecName: "kube-api-access-2z682") pod "6616a935-12f1-4f60-a206-1dbcfd9a6400" (UID: "6616a935-12f1-4f60-a206-1dbcfd9a6400"). InnerVolumeSpecName "kube-api-access-2z682". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.403568 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-scripts" (OuterVolumeSpecName: "scripts") pod "6616a935-12f1-4f60-a206-1dbcfd9a6400" (UID: "6616a935-12f1-4f60-a206-1dbcfd9a6400"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.422632 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6616a935-12f1-4f60-a206-1dbcfd9a6400" (UID: "6616a935-12f1-4f60-a206-1dbcfd9a6400"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.441058 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-config-data" (OuterVolumeSpecName: "config-data") pod "6616a935-12f1-4f60-a206-1dbcfd9a6400" (UID: "6616a935-12f1-4f60-a206-1dbcfd9a6400"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.490702 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.490732 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z682\" (UniqueName: \"kubernetes.io/projected/6616a935-12f1-4f60-a206-1dbcfd9a6400-kube-api-access-2z682\") on node \"crc\" DevicePath \"\""
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.490745 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.490753 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6616a935-12f1-4f60-a206-1dbcfd9a6400-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.873834 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ndtgv" event={"ID":"6616a935-12f1-4f60-a206-1dbcfd9a6400","Type":"ContainerDied","Data":"63f8ab5eb38df5de5968952ed5cf85eb6395ec5991490ba7f844289e91acb5bf"}
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.873902 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63f8ab5eb38df5de5968952ed5cf85eb6395ec5991490ba7f844289e91acb5bf"
Sep 30 19:51:31 crc kubenswrapper[4553]: I0930 19:51:31.874116 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ndtgv"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.031702 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Sep 30 19:51:32 crc kubenswrapper[4553]: E0930 19:51:32.032070 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6616a935-12f1-4f60-a206-1dbcfd9a6400" containerName="nova-cell0-conductor-db-sync"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.032090 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="6616a935-12f1-4f60-a206-1dbcfd9a6400" containerName="nova-cell0-conductor-db-sync"
Sep 30 19:51:32 crc kubenswrapper[4553]: E0930 19:51:32.032104 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82733d90-45f9-482e-a453-3b52a14b064e" containerName="mariadb-account-create"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.032111 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="82733d90-45f9-482e-a453-3b52a14b064e" containerName="mariadb-account-create"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.032271 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="6616a935-12f1-4f60-a206-1dbcfd9a6400" containerName="nova-cell0-conductor-db-sync"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.032280 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="82733d90-45f9-482e-a453-3b52a14b064e" containerName="mariadb-account-create"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.032836 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.039581 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.039856 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-s2dpn"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.052103 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.204829 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n8x6\" (UniqueName: \"kubernetes.io/projected/df4bf445-73fe-492e-9104-f1a0879510d4-kube-api-access-4n8x6\") pod \"nova-cell0-conductor-0\" (UID: \"df4bf445-73fe-492e-9104-f1a0879510d4\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.204964 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4bf445-73fe-492e-9104-f1a0879510d4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"df4bf445-73fe-492e-9104-f1a0879510d4\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.204996 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4bf445-73fe-492e-9104-f1a0879510d4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"df4bf445-73fe-492e-9104-f1a0879510d4\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.306855 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n8x6\" (UniqueName: \"kubernetes.io/projected/df4bf445-73fe-492e-9104-f1a0879510d4-kube-api-access-4n8x6\") pod \"nova-cell0-conductor-0\" (UID: \"df4bf445-73fe-492e-9104-f1a0879510d4\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.307197 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4bf445-73fe-492e-9104-f1a0879510d4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"df4bf445-73fe-492e-9104-f1a0879510d4\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.307295 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4bf445-73fe-492e-9104-f1a0879510d4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"df4bf445-73fe-492e-9104-f1a0879510d4\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.312183 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4bf445-73fe-492e-9104-f1a0879510d4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"df4bf445-73fe-492e-9104-f1a0879510d4\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.312577 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4bf445-73fe-492e-9104-f1a0879510d4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"df4bf445-73fe-492e-9104-f1a0879510d4\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.333694 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n8x6\" (UniqueName: \"kubernetes.io/projected/df4bf445-73fe-492e-9104-f1a0879510d4-kube-api-access-4n8x6\") pod \"nova-cell0-conductor-0\" (UID: \"df4bf445-73fe-492e-9104-f1a0879510d4\") " pod="openstack/nova-cell0-conductor-0"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.355834 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.823325 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Sep 30 19:51:32 crc kubenswrapper[4553]: W0930 19:51:32.828172 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf4bf445_73fe_492e_9104_f1a0879510d4.slice/crio-df85b8bbd828de624d472759be53f2f45b76fff3f68668ac99e73f935dfee806 WatchSource:0}: Error finding container df85b8bbd828de624d472759be53f2f45b76fff3f68668ac99e73f935dfee806: Status 404 returned error can't find the container with id df85b8bbd828de624d472759be53f2f45b76fff3f68668ac99e73f935dfee806
Sep 30 19:51:32 crc kubenswrapper[4553]: I0930 19:51:32.884082 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"df4bf445-73fe-492e-9104-f1a0879510d4","Type":"ContainerStarted","Data":"df85b8bbd828de624d472759be53f2f45b76fff3f68668ac99e73f935dfee806"}
Sep 30 19:51:33 crc kubenswrapper[4553]: I0930 19:51:33.893494 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"df4bf445-73fe-492e-9104-f1a0879510d4","Type":"ContainerStarted","Data":"75f0ed9bdff51f746466e0e435f806e0b48b72c57a5900c1284fce248e79a888"}
Sep 30 19:51:33 crc kubenswrapper[4553]: I0930 19:51:33.894184 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.029363 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.029344951 podStartE2EDuration="2.029344951s" podCreationTimestamp="2025-09-30 19:51:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:51:33.921333062 +0000 UTC m=+1147.120835202" watchObservedRunningTime="2025-09-30 19:51:34.029344951 +0000 UTC m=+1147.228847081"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.035003 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.035210 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c828a401-ebca-4e9d-850e-d6f74d380257" containerName="kube-state-metrics" containerID="cri-o://558b77ad7d87b080239c53e2ff99ca3294d8daddd4d93e5fb4e9acb11f8eabff" gracePeriod=30
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.052486 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="c828a401-ebca-4e9d-850e-d6f74d380257" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": dial tcp 10.217.0.107:8081: connect: connection refused"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.496559 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.649625 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwst9\" (UniqueName: \"kubernetes.io/projected/c828a401-ebca-4e9d-850e-d6f74d380257-kube-api-access-vwst9\") pod \"c828a401-ebca-4e9d-850e-d6f74d380257\" (UID: \"c828a401-ebca-4e9d-850e-d6f74d380257\") "
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.666788 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c828a401-ebca-4e9d-850e-d6f74d380257-kube-api-access-vwst9" (OuterVolumeSpecName: "kube-api-access-vwst9") pod "c828a401-ebca-4e9d-850e-d6f74d380257" (UID: "c828a401-ebca-4e9d-850e-d6f74d380257"). InnerVolumeSpecName "kube-api-access-vwst9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.751716 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwst9\" (UniqueName: \"kubernetes.io/projected/c828a401-ebca-4e9d-850e-d6f74d380257-kube-api-access-vwst9\") on node \"crc\" DevicePath \"\""
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.904082 4553 generic.go:334] "Generic (PLEG): container finished" podID="c828a401-ebca-4e9d-850e-d6f74d380257" containerID="558b77ad7d87b080239c53e2ff99ca3294d8daddd4d93e5fb4e9acb11f8eabff" exitCode=2
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.904966 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.906158 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c828a401-ebca-4e9d-850e-d6f74d380257","Type":"ContainerDied","Data":"558b77ad7d87b080239c53e2ff99ca3294d8daddd4d93e5fb4e9acb11f8eabff"}
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.906201 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c828a401-ebca-4e9d-850e-d6f74d380257","Type":"ContainerDied","Data":"7157764247c0e01afa00a9fa4c140f85e0657f7160647fe86a27263990541d94"}
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.906223 4553 scope.go:117] "RemoveContainer" containerID="558b77ad7d87b080239c53e2ff99ca3294d8daddd4d93e5fb4e9acb11f8eabff"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.947765 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.957842 4553 scope.go:117] "RemoveContainer" containerID="558b77ad7d87b080239c53e2ff99ca3294d8daddd4d93e5fb4e9acb11f8eabff"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.958506 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Sep 30 19:51:34 crc kubenswrapper[4553]: E0930 19:51:34.959417 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"558b77ad7d87b080239c53e2ff99ca3294d8daddd4d93e5fb4e9acb11f8eabff\": container with ID starting with 558b77ad7d87b080239c53e2ff99ca3294d8daddd4d93e5fb4e9acb11f8eabff not found: ID does not exist" containerID="558b77ad7d87b080239c53e2ff99ca3294d8daddd4d93e5fb4e9acb11f8eabff"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.959449 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"558b77ad7d87b080239c53e2ff99ca3294d8daddd4d93e5fb4e9acb11f8eabff"} err="failed to get container status \"558b77ad7d87b080239c53e2ff99ca3294d8daddd4d93e5fb4e9acb11f8eabff\": rpc error: code = NotFound desc = could not find container \"558b77ad7d87b080239c53e2ff99ca3294d8daddd4d93e5fb4e9acb11f8eabff\": container with ID starting with 558b77ad7d87b080239c53e2ff99ca3294d8daddd4d93e5fb4e9acb11f8eabff not found: ID does not exist"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.974340 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Sep 30 19:51:34 crc kubenswrapper[4553]: E0930 19:51:34.986844 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c828a401-ebca-4e9d-850e-d6f74d380257" containerName="kube-state-metrics"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.987113 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="c828a401-ebca-4e9d-850e-d6f74d380257" containerName="kube-state-metrics"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.987623 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="c828a401-ebca-4e9d-850e-d6f74d380257" containerName="kube-state-metrics"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.988817 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.992250 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Sep 30 19:51:34 crc kubenswrapper[4553]: I0930 19:51:34.993956 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.002252 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.063580 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6093f7c3-b483-4b10-89c1-ea1a4c118ca7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6093f7c3-b483-4b10-89c1-ea1a4c118ca7\") " pod="openstack/kube-state-metrics-0"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.063651 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6093f7c3-b483-4b10-89c1-ea1a4c118ca7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6093f7c3-b483-4b10-89c1-ea1a4c118ca7\") " pod="openstack/kube-state-metrics-0"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.063794 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6093f7c3-b483-4b10-89c1-ea1a4c118ca7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6093f7c3-b483-4b10-89c1-ea1a4c118ca7\") " pod="openstack/kube-state-metrics-0"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.063962 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhggk\" (UniqueName: \"kubernetes.io/projected/6093f7c3-b483-4b10-89c1-ea1a4c118ca7-kube-api-access-fhggk\") pod \"kube-state-metrics-0\" (UID: \"6093f7c3-b483-4b10-89c1-ea1a4c118ca7\") " pod="openstack/kube-state-metrics-0"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.165301 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6093f7c3-b483-4b10-89c1-ea1a4c118ca7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6093f7c3-b483-4b10-89c1-ea1a4c118ca7\") " pod="openstack/kube-state-metrics-0"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.165578 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6093f7c3-b483-4b10-89c1-ea1a4c118ca7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6093f7c3-b483-4b10-89c1-ea1a4c118ca7\") " pod="openstack/kube-state-metrics-0"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.165688 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6093f7c3-b483-4b10-89c1-ea1a4c118ca7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6093f7c3-b483-4b10-89c1-ea1a4c118ca7\") " pod="openstack/kube-state-metrics-0"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.165822 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhggk\" (UniqueName: \"kubernetes.io/projected/6093f7c3-b483-4b10-89c1-ea1a4c118ca7-kube-api-access-fhggk\") pod \"kube-state-metrics-0\" (UID: \"6093f7c3-b483-4b10-89c1-ea1a4c118ca7\") " pod="openstack/kube-state-metrics-0"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.171772 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6093f7c3-b483-4b10-89c1-ea1a4c118ca7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6093f7c3-b483-4b10-89c1-ea1a4c118ca7\") " pod="openstack/kube-state-metrics-0"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.171806 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6093f7c3-b483-4b10-89c1-ea1a4c118ca7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6093f7c3-b483-4b10-89c1-ea1a4c118ca7\") " pod="openstack/kube-state-metrics-0"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.172354 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6093f7c3-b483-4b10-89c1-ea1a4c118ca7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6093f7c3-b483-4b10-89c1-ea1a4c118ca7\") " pod="openstack/kube-state-metrics-0"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.189905 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhggk\" (UniqueName: \"kubernetes.io/projected/6093f7c3-b483-4b10-89c1-ea1a4c118ca7-kube-api-access-fhggk\") pod \"kube-state-metrics-0\" (UID: \"6093f7c3-b483-4b10-89c1-ea1a4c118ca7\") " pod="openstack/kube-state-metrics-0"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.309474 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.522575 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c828a401-ebca-4e9d-850e-d6f74d380257" path="/var/lib/kubelet/pods/c828a401-ebca-4e9d-850e-d6f74d380257/volumes"
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.746597 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.911972 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6093f7c3-b483-4b10-89c1-ea1a4c118ca7","Type":"ContainerStarted","Data":"0fb6f2406586bbb5d77466be09576ff17593a62f1698e4fb597c1eb0d4adc4b4"}
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.963253 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.963518 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="ceilometer-central-agent" containerID="cri-o://75c7105a4c59d9a39223603003be2e92e264d5c999dba8759e28681ddb2a778b" gracePeriod=30
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.963579 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="proxy-httpd" containerID="cri-o://20d524eecf8e47a5dfc74efd3b4282d465ff7cc37368f4ce1d4d20c1f33ec49d" gracePeriod=30
Sep 30 19:51:35 crc kubenswrapper[4553]: I0930 19:51:35.963604 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="ceilometer-notification-agent" containerID="cri-o://82ac53f043f89a5cb7cde4a325fde041898ca52468f9f3ad06ecb0fdc764062a" gracePeriod=30
Sep 30 19:51:35
crc kubenswrapper[4553]: I0930 19:51:35.963648 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="sg-core" containerID="cri-o://5548e03e6a26d21dd3c4ce6f61aca55919e298cdb5a0f45ee19f9ba5dc63d704" gracePeriod=30 Sep 30 19:51:36 crc kubenswrapper[4553]: I0930 19:51:36.924091 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6093f7c3-b483-4b10-89c1-ea1a4c118ca7","Type":"ContainerStarted","Data":"7a64d5b2db27774782e1fbbc89e57f83b5a6f45d84035eacb050df6d1917ff81"} Sep 30 19:51:36 crc kubenswrapper[4553]: I0930 19:51:36.925318 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 19:51:36 crc kubenswrapper[4553]: I0930 19:51:36.928955 4553 generic.go:334] "Generic (PLEG): container finished" podID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerID="20d524eecf8e47a5dfc74efd3b4282d465ff7cc37368f4ce1d4d20c1f33ec49d" exitCode=0 Sep 30 19:51:36 crc kubenswrapper[4553]: I0930 19:51:36.929002 4553 generic.go:334] "Generic (PLEG): container finished" podID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerID="5548e03e6a26d21dd3c4ce6f61aca55919e298cdb5a0f45ee19f9ba5dc63d704" exitCode=2 Sep 30 19:51:36 crc kubenswrapper[4553]: I0930 19:51:36.929015 4553 generic.go:334] "Generic (PLEG): container finished" podID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerID="75c7105a4c59d9a39223603003be2e92e264d5c999dba8759e28681ddb2a778b" exitCode=0 Sep 30 19:51:36 crc kubenswrapper[4553]: I0930 19:51:36.929058 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bffdc103-73a5-426d-af27-cb6efe0c9603","Type":"ContainerDied","Data":"20d524eecf8e47a5dfc74efd3b4282d465ff7cc37368f4ce1d4d20c1f33ec49d"} Sep 30 19:51:36 crc kubenswrapper[4553]: I0930 19:51:36.929092 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"bffdc103-73a5-426d-af27-cb6efe0c9603","Type":"ContainerDied","Data":"5548e03e6a26d21dd3c4ce6f61aca55919e298cdb5a0f45ee19f9ba5dc63d704"} Sep 30 19:51:36 crc kubenswrapper[4553]: I0930 19:51:36.929106 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bffdc103-73a5-426d-af27-cb6efe0c9603","Type":"ContainerDied","Data":"75c7105a4c59d9a39223603003be2e92e264d5c999dba8759e28681ddb2a778b"} Sep 30 19:51:36 crc kubenswrapper[4553]: I0930 19:51:36.946227 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.590064156 podStartE2EDuration="2.946211613s" podCreationTimestamp="2025-09-30 19:51:34 +0000 UTC" firstStartedPulling="2025-09-30 19:51:35.753648961 +0000 UTC m=+1148.953151091" lastFinishedPulling="2025-09-30 19:51:36.109796418 +0000 UTC m=+1149.309298548" observedRunningTime="2025-09-30 19:51:36.942068881 +0000 UTC m=+1150.141571011" watchObservedRunningTime="2025-09-30 19:51:36.946211613 +0000 UTC m=+1150.145713743" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.497402 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.610415 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-config-data\") pod \"bffdc103-73a5-426d-af27-cb6efe0c9603\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.610471 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-combined-ca-bundle\") pod \"bffdc103-73a5-426d-af27-cb6efe0c9603\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.610565 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-sg-core-conf-yaml\") pod \"bffdc103-73a5-426d-af27-cb6efe0c9603\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.610581 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-scripts\") pod \"bffdc103-73a5-426d-af27-cb6efe0c9603\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.610600 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcz4k\" (UniqueName: \"kubernetes.io/projected/bffdc103-73a5-426d-af27-cb6efe0c9603-kube-api-access-dcz4k\") pod \"bffdc103-73a5-426d-af27-cb6efe0c9603\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.610625 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bffdc103-73a5-426d-af27-cb6efe0c9603-run-httpd\") pod \"bffdc103-73a5-426d-af27-cb6efe0c9603\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.610661 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bffdc103-73a5-426d-af27-cb6efe0c9603-log-httpd\") pod \"bffdc103-73a5-426d-af27-cb6efe0c9603\" (UID: \"bffdc103-73a5-426d-af27-cb6efe0c9603\") " Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.611610 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bffdc103-73a5-426d-af27-cb6efe0c9603-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bffdc103-73a5-426d-af27-cb6efe0c9603" (UID: "bffdc103-73a5-426d-af27-cb6efe0c9603"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.612005 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bffdc103-73a5-426d-af27-cb6efe0c9603-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bffdc103-73a5-426d-af27-cb6efe0c9603" (UID: "bffdc103-73a5-426d-af27-cb6efe0c9603"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.616327 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-scripts" (OuterVolumeSpecName: "scripts") pod "bffdc103-73a5-426d-af27-cb6efe0c9603" (UID: "bffdc103-73a5-426d-af27-cb6efe0c9603"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.633325 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bffdc103-73a5-426d-af27-cb6efe0c9603-kube-api-access-dcz4k" (OuterVolumeSpecName: "kube-api-access-dcz4k") pod "bffdc103-73a5-426d-af27-cb6efe0c9603" (UID: "bffdc103-73a5-426d-af27-cb6efe0c9603"). InnerVolumeSpecName "kube-api-access-dcz4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.647876 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bffdc103-73a5-426d-af27-cb6efe0c9603" (UID: "bffdc103-73a5-426d-af27-cb6efe0c9603"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.685269 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bffdc103-73a5-426d-af27-cb6efe0c9603" (UID: "bffdc103-73a5-426d-af27-cb6efe0c9603"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.712987 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.713018 4553 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.713027 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.713054 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcz4k\" (UniqueName: \"kubernetes.io/projected/bffdc103-73a5-426d-af27-cb6efe0c9603-kube-api-access-dcz4k\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.713065 4553 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bffdc103-73a5-426d-af27-cb6efe0c9603-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.713073 4553 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bffdc103-73a5-426d-af27-cb6efe0c9603-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.724316 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-config-data" (OuterVolumeSpecName: "config-data") pod "bffdc103-73a5-426d-af27-cb6efe0c9603" (UID: "bffdc103-73a5-426d-af27-cb6efe0c9603"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.814985 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bffdc103-73a5-426d-af27-cb6efe0c9603-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.945118 4553 generic.go:334] "Generic (PLEG): container finished" podID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerID="82ac53f043f89a5cb7cde4a325fde041898ca52468f9f3ad06ecb0fdc764062a" exitCode=0 Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.945166 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.945215 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bffdc103-73a5-426d-af27-cb6efe0c9603","Type":"ContainerDied","Data":"82ac53f043f89a5cb7cde4a325fde041898ca52468f9f3ad06ecb0fdc764062a"} Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.945291 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bffdc103-73a5-426d-af27-cb6efe0c9603","Type":"ContainerDied","Data":"051c2f2a3a3cab43f202020cefdcd7adc5a73a9dbc3dab2ce966bd1e13eaa840"} Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.945313 4553 scope.go:117] "RemoveContainer" containerID="20d524eecf8e47a5dfc74efd3b4282d465ff7cc37368f4ce1d4d20c1f33ec49d" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.979031 4553 scope.go:117] "RemoveContainer" containerID="5548e03e6a26d21dd3c4ce6f61aca55919e298cdb5a0f45ee19f9ba5dc63d704" Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.988154 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:51:37 crc kubenswrapper[4553]: I0930 19:51:37.996967 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.014347 4553 scope.go:117] "RemoveContainer" containerID="82ac53f043f89a5cb7cde4a325fde041898ca52468f9f3ad06ecb0fdc764062a" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.034780 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:51:38 crc kubenswrapper[4553]: E0930 19:51:38.035700 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="ceilometer-central-agent" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.035721 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="ceilometer-central-agent" Sep 30 19:51:38 crc kubenswrapper[4553]: E0930 19:51:38.035735 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="proxy-httpd" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.035741 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="proxy-httpd" Sep 30 19:51:38 crc kubenswrapper[4553]: E0930 19:51:38.035769 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="ceilometer-notification-agent" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.035775 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="ceilometer-notification-agent" Sep 30 19:51:38 crc kubenswrapper[4553]: E0930 19:51:38.035810 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="sg-core" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.035820 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="sg-core" Sep 30 19:51:38 crc 
kubenswrapper[4553]: I0930 19:51:38.037945 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="proxy-httpd" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.037969 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="sg-core" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.037989 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="ceilometer-central-agent" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.038004 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" containerName="ceilometer-notification-agent" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.047623 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.052332 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.052708 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.053125 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.060475 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.077249 4553 scope.go:117] "RemoveContainer" containerID="75c7105a4c59d9a39223603003be2e92e264d5c999dba8759e28681ddb2a778b" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.095975 4553 scope.go:117] "RemoveContainer" containerID="20d524eecf8e47a5dfc74efd3b4282d465ff7cc37368f4ce1d4d20c1f33ec49d" Sep 30 19:51:38 crc 
kubenswrapper[4553]: E0930 19:51:38.096901 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20d524eecf8e47a5dfc74efd3b4282d465ff7cc37368f4ce1d4d20c1f33ec49d\": container with ID starting with 20d524eecf8e47a5dfc74efd3b4282d465ff7cc37368f4ce1d4d20c1f33ec49d not found: ID does not exist" containerID="20d524eecf8e47a5dfc74efd3b4282d465ff7cc37368f4ce1d4d20c1f33ec49d" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.096940 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20d524eecf8e47a5dfc74efd3b4282d465ff7cc37368f4ce1d4d20c1f33ec49d"} err="failed to get container status \"20d524eecf8e47a5dfc74efd3b4282d465ff7cc37368f4ce1d4d20c1f33ec49d\": rpc error: code = NotFound desc = could not find container \"20d524eecf8e47a5dfc74efd3b4282d465ff7cc37368f4ce1d4d20c1f33ec49d\": container with ID starting with 20d524eecf8e47a5dfc74efd3b4282d465ff7cc37368f4ce1d4d20c1f33ec49d not found: ID does not exist" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.096967 4553 scope.go:117] "RemoveContainer" containerID="5548e03e6a26d21dd3c4ce6f61aca55919e298cdb5a0f45ee19f9ba5dc63d704" Sep 30 19:51:38 crc kubenswrapper[4553]: E0930 19:51:38.097788 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5548e03e6a26d21dd3c4ce6f61aca55919e298cdb5a0f45ee19f9ba5dc63d704\": container with ID starting with 5548e03e6a26d21dd3c4ce6f61aca55919e298cdb5a0f45ee19f9ba5dc63d704 not found: ID does not exist" containerID="5548e03e6a26d21dd3c4ce6f61aca55919e298cdb5a0f45ee19f9ba5dc63d704" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.097820 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5548e03e6a26d21dd3c4ce6f61aca55919e298cdb5a0f45ee19f9ba5dc63d704"} err="failed to get container status 
\"5548e03e6a26d21dd3c4ce6f61aca55919e298cdb5a0f45ee19f9ba5dc63d704\": rpc error: code = NotFound desc = could not find container \"5548e03e6a26d21dd3c4ce6f61aca55919e298cdb5a0f45ee19f9ba5dc63d704\": container with ID starting with 5548e03e6a26d21dd3c4ce6f61aca55919e298cdb5a0f45ee19f9ba5dc63d704 not found: ID does not exist" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.097839 4553 scope.go:117] "RemoveContainer" containerID="82ac53f043f89a5cb7cde4a325fde041898ca52468f9f3ad06ecb0fdc764062a" Sep 30 19:51:38 crc kubenswrapper[4553]: E0930 19:51:38.098149 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ac53f043f89a5cb7cde4a325fde041898ca52468f9f3ad06ecb0fdc764062a\": container with ID starting with 82ac53f043f89a5cb7cde4a325fde041898ca52468f9f3ad06ecb0fdc764062a not found: ID does not exist" containerID="82ac53f043f89a5cb7cde4a325fde041898ca52468f9f3ad06ecb0fdc764062a" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.098175 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ac53f043f89a5cb7cde4a325fde041898ca52468f9f3ad06ecb0fdc764062a"} err="failed to get container status \"82ac53f043f89a5cb7cde4a325fde041898ca52468f9f3ad06ecb0fdc764062a\": rpc error: code = NotFound desc = could not find container \"82ac53f043f89a5cb7cde4a325fde041898ca52468f9f3ad06ecb0fdc764062a\": container with ID starting with 82ac53f043f89a5cb7cde4a325fde041898ca52468f9f3ad06ecb0fdc764062a not found: ID does not exist" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.098191 4553 scope.go:117] "RemoveContainer" containerID="75c7105a4c59d9a39223603003be2e92e264d5c999dba8759e28681ddb2a778b" Sep 30 19:51:38 crc kubenswrapper[4553]: E0930 19:51:38.099261 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"75c7105a4c59d9a39223603003be2e92e264d5c999dba8759e28681ddb2a778b\": container with ID starting with 75c7105a4c59d9a39223603003be2e92e264d5c999dba8759e28681ddb2a778b not found: ID does not exist" containerID="75c7105a4c59d9a39223603003be2e92e264d5c999dba8759e28681ddb2a778b" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.099350 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c7105a4c59d9a39223603003be2e92e264d5c999dba8759e28681ddb2a778b"} err="failed to get container status \"75c7105a4c59d9a39223603003be2e92e264d5c999dba8759e28681ddb2a778b\": rpc error: code = NotFound desc = could not find container \"75c7105a4c59d9a39223603003be2e92e264d5c999dba8759e28681ddb2a778b\": container with ID starting with 75c7105a4c59d9a39223603003be2e92e264d5c999dba8759e28681ddb2a778b not found: ID does not exist" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.146593 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsfnl\" (UniqueName: \"kubernetes.io/projected/f4d66447-e03a-4cc3-9cf6-c99358e848de-kube-api-access-lsfnl\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.146633 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4d66447-e03a-4cc3-9cf6-c99358e848de-run-httpd\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.146672 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " 
pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.146695 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4d66447-e03a-4cc3-9cf6-c99358e848de-log-httpd\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.146751 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-scripts\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.146766 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.146803 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-config-data\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.146841 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.248304 4553 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.248356 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4d66447-e03a-4cc3-9cf6-c99358e848de-log-httpd\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.248419 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-scripts\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.248437 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.248475 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-config-data\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.248515 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " 
pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.248543 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsfnl\" (UniqueName: \"kubernetes.io/projected/f4d66447-e03a-4cc3-9cf6-c99358e848de-kube-api-access-lsfnl\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.248567 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4d66447-e03a-4cc3-9cf6-c99358e848de-run-httpd\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.249156 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4d66447-e03a-4cc3-9cf6-c99358e848de-log-httpd\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.249170 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4d66447-e03a-4cc3-9cf6-c99358e848de-run-httpd\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.252500 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.253777 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.253842 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.254957 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-scripts\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.255399 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-config-data\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.267625 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsfnl\" (UniqueName: \"kubernetes.io/projected/f4d66447-e03a-4cc3-9cf6-c99358e848de-kube-api-access-lsfnl\") pod \"ceilometer-0\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.374216 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.836483 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:51:38 crc kubenswrapper[4553]: I0930 19:51:38.959874 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4d66447-e03a-4cc3-9cf6-c99358e848de","Type":"ContainerStarted","Data":"19ddf8debbfcffeed13df609b2cc33cdc644e7ccb46d5db89009f09d5f4e44a0"} Sep 30 19:51:39 crc kubenswrapper[4553]: I0930 19:51:39.514723 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bffdc103-73a5-426d-af27-cb6efe0c9603" path="/var/lib/kubelet/pods/bffdc103-73a5-426d-af27-cb6efe0c9603/volumes" Sep 30 19:51:39 crc kubenswrapper[4553]: I0930 19:51:39.973498 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4d66447-e03a-4cc3-9cf6-c99358e848de","Type":"ContainerStarted","Data":"7998ae701e2817d1976708dfd8779c9125a906db5e6c124370d2f68dc1a59d5a"} Sep 30 19:51:40 crc kubenswrapper[4553]: I0930 19:51:40.988597 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4d66447-e03a-4cc3-9cf6-c99358e848de","Type":"ContainerStarted","Data":"79722e5da2e24c11b7382ddee77cd86ea214e0a1196856aea5d13ab73f89a429"} Sep 30 19:51:40 crc kubenswrapper[4553]: I0930 19:51:40.988849 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4d66447-e03a-4cc3-9cf6-c99358e848de","Type":"ContainerStarted","Data":"694a314055389753e23170211f8a634c89f3c2b0d2a10dd7d9422a6d88cb48d1"} Sep 30 19:51:42 crc kubenswrapper[4553]: I0930 19:51:42.391283 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Sep 30 19:51:42 crc kubenswrapper[4553]: I0930 19:51:42.971052 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-ghdd6"] Sep 30 
19:51:42 crc kubenswrapper[4553]: I0930 19:51:42.972677 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:42 crc kubenswrapper[4553]: I0930 19:51:42.976232 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 30 19:51:42 crc kubenswrapper[4553]: I0930 19:51:42.976414 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 30 19:51:42 crc kubenswrapper[4553]: I0930 19:51:42.992377 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ghdd6"] Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.006201 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4d66447-e03a-4cc3-9cf6-c99358e848de","Type":"ContainerStarted","Data":"1044986b3a47f86a9c4d0d63401ad968abe8b000ff5048d7711ba66e6e737c78"} Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.006998 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.045674 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-config-data\") pod \"nova-cell0-cell-mapping-ghdd6\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.045723 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hkw7\" (UniqueName: \"kubernetes.io/projected/321c9e7b-0cfd-440b-a1c9-664990e119c5-kube-api-access-4hkw7\") pod \"nova-cell0-cell-mapping-ghdd6\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:43 crc 
kubenswrapper[4553]: I0930 19:51:43.045838 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-scripts\") pod \"nova-cell0-cell-mapping-ghdd6\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.045859 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ghdd6\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.147116 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-scripts\") pod \"nova-cell0-cell-mapping-ghdd6\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.147179 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ghdd6\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.147283 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-config-data\") pod \"nova-cell0-cell-mapping-ghdd6\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 
19:51:43.147303 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hkw7\" (UniqueName: \"kubernetes.io/projected/321c9e7b-0cfd-440b-a1c9-664990e119c5-kube-api-access-4hkw7\") pod \"nova-cell0-cell-mapping-ghdd6\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.159606 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-scripts\") pod \"nova-cell0-cell-mapping-ghdd6\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.161939 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ghdd6\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.166679 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-config-data\") pod \"nova-cell0-cell-mapping-ghdd6\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.188002 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hkw7\" (UniqueName: \"kubernetes.io/projected/321c9e7b-0cfd-440b-a1c9-664990e119c5-kube-api-access-4hkw7\") pod \"nova-cell0-cell-mapping-ghdd6\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.337026 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.402443 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.752168289 podStartE2EDuration="6.402426441s" podCreationTimestamp="2025-09-30 19:51:37 +0000 UTC" firstStartedPulling="2025-09-30 19:51:38.849236809 +0000 UTC m=+1152.048738949" lastFinishedPulling="2025-09-30 19:51:42.499494971 +0000 UTC m=+1155.698997101" observedRunningTime="2025-09-30 19:51:43.129646931 +0000 UTC m=+1156.329149071" watchObservedRunningTime="2025-09-30 19:51:43.402426441 +0000 UTC m=+1156.601928571" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.405099 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.412823 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.431724 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.455455 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d8e670-bb79-46a5-885a-35deb8d0ab28-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"10d8e670-bb79-46a5-885a-35deb8d0ab28\") " pod="openstack/nova-scheduler-0" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.455545 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkstj\" (UniqueName: \"kubernetes.io/projected/10d8e670-bb79-46a5-885a-35deb8d0ab28-kube-api-access-xkstj\") pod \"nova-scheduler-0\" (UID: \"10d8e670-bb79-46a5-885a-35deb8d0ab28\") " pod="openstack/nova-scheduler-0" Sep 30 19:51:43 crc 
kubenswrapper[4553]: I0930 19:51:43.455594 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d8e670-bb79-46a5-885a-35deb8d0ab28-config-data\") pod \"nova-scheduler-0\" (UID: \"10d8e670-bb79-46a5-885a-35deb8d0ab28\") " pod="openstack/nova-scheduler-0" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.456962 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.558990 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkstj\" (UniqueName: \"kubernetes.io/projected/10d8e670-bb79-46a5-885a-35deb8d0ab28-kube-api-access-xkstj\") pod \"nova-scheduler-0\" (UID: \"10d8e670-bb79-46a5-885a-35deb8d0ab28\") " pod="openstack/nova-scheduler-0" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.559072 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d8e670-bb79-46a5-885a-35deb8d0ab28-config-data\") pod \"nova-scheduler-0\" (UID: \"10d8e670-bb79-46a5-885a-35deb8d0ab28\") " pod="openstack/nova-scheduler-0" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.559142 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d8e670-bb79-46a5-885a-35deb8d0ab28-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"10d8e670-bb79-46a5-885a-35deb8d0ab28\") " pod="openstack/nova-scheduler-0" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.568836 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d8e670-bb79-46a5-885a-35deb8d0ab28-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"10d8e670-bb79-46a5-885a-35deb8d0ab28\") " pod="openstack/nova-scheduler-0" Sep 30 19:51:43 crc 
kubenswrapper[4553]: I0930 19:51:43.593349 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d8e670-bb79-46a5-885a-35deb8d0ab28-config-data\") pod \"nova-scheduler-0\" (UID: \"10d8e670-bb79-46a5-885a-35deb8d0ab28\") " pod="openstack/nova-scheduler-0" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.623637 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkstj\" (UniqueName: \"kubernetes.io/projected/10d8e670-bb79-46a5-885a-35deb8d0ab28-kube-api-access-xkstj\") pod \"nova-scheduler-0\" (UID: \"10d8e670-bb79-46a5-885a-35deb8d0ab28\") " pod="openstack/nova-scheduler-0" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.741969 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.836959 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.838573 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.844628 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.870558 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.872106 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.875275 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.898131 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.942249 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.943438 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.950269 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 19:51:43 crc kubenswrapper[4553]: I0930 19:51:43.975238 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.001845 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " pod="openstack/nova-api-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.002065 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-logs\") pod \"nova-api-0\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " pod="openstack/nova-api-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.002096 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0ab72afb-ab85-433a-9305-f157654c6755-config-data\") pod \"nova-metadata-0\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") " pod="openstack/nova-metadata-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.002139 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frqxz\" (UniqueName: \"kubernetes.io/projected/0ab72afb-ab85-433a-9305-f157654c6755-kube-api-access-frqxz\") pod \"nova-metadata-0\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") " pod="openstack/nova-metadata-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.002231 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab72afb-ab85-433a-9305-f157654c6755-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") " pod="openstack/nova-metadata-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.003300 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-config-data\") pod \"nova-api-0\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " pod="openstack/nova-api-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.003379 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bmll\" (UniqueName: \"kubernetes.io/projected/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-kube-api-access-9bmll\") pod \"nova-api-0\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " pod="openstack/nova-api-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.003396 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ab72afb-ab85-433a-9305-f157654c6755-logs\") pod \"nova-metadata-0\" (UID: 
\"0ab72afb-ab85-433a-9305-f157654c6755\") " pod="openstack/nova-metadata-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.077631 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.117986 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-bvvj2"] Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.135831 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " pod="openstack/nova-api-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.141407 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ab72afb-ab85-433a-9305-f157654c6755-config-data\") pod \"nova-metadata-0\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") " pod="openstack/nova-metadata-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.141456 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-logs\") pod \"nova-api-0\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " pod="openstack/nova-api-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.141521 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frqxz\" (UniqueName: \"kubernetes.io/projected/0ab72afb-ab85-433a-9305-f157654c6755-kube-api-access-frqxz\") pod \"nova-metadata-0\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") " pod="openstack/nova-metadata-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.141552 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0ab72afb-ab85-433a-9305-f157654c6755-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") " pod="openstack/nova-metadata-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.141576 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-config-data\") pod \"nova-api-0\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " pod="openstack/nova-api-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.141657 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bmll\" (UniqueName: \"kubernetes.io/projected/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-kube-api-access-9bmll\") pod \"nova-api-0\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " pod="openstack/nova-api-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.141678 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ab72afb-ab85-433a-9305-f157654c6755-logs\") pod \"nova-metadata-0\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") " pod="openstack/nova-metadata-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.141759 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jt7l\" (UniqueName: \"kubernetes.io/projected/e815e92c-4105-40fd-90e6-a17d35cdf5c6-kube-api-access-4jt7l\") pod \"nova-cell1-novncproxy-0\" (UID: \"e815e92c-4105-40fd-90e6-a17d35cdf5c6\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.141827 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e815e92c-4105-40fd-90e6-a17d35cdf5c6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"e815e92c-4105-40fd-90e6-a17d35cdf5c6\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.141899 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e815e92c-4105-40fd-90e6-a17d35cdf5c6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e815e92c-4105-40fd-90e6-a17d35cdf5c6\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.143969 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-logs\") pod \"nova-api-0\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " pod="openstack/nova-api-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.145810 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ab72afb-ab85-433a-9305-f157654c6755-logs\") pod \"nova-metadata-0\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") " pod="openstack/nova-metadata-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.167807 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-config-data\") pod \"nova-api-0\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " pod="openstack/nova-api-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.184229 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab72afb-ab85-433a-9305-f157654c6755-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") " pod="openstack/nova-metadata-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.184791 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0ab72afb-ab85-433a-9305-f157654c6755-config-data\") pod \"nova-metadata-0\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") " pod="openstack/nova-metadata-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.196781 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.217360 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " pod="openstack/nova-api-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.237127 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bmll\" (UniqueName: \"kubernetes.io/projected/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-kube-api-access-9bmll\") pod \"nova-api-0\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " pod="openstack/nova-api-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.253852 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jt7l\" (UniqueName: \"kubernetes.io/projected/e815e92c-4105-40fd-90e6-a17d35cdf5c6-kube-api-access-4jt7l\") pod \"nova-cell1-novncproxy-0\" (UID: \"e815e92c-4105-40fd-90e6-a17d35cdf5c6\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.254007 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e815e92c-4105-40fd-90e6-a17d35cdf5c6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e815e92c-4105-40fd-90e6-a17d35cdf5c6\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.256470 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e815e92c-4105-40fd-90e6-a17d35cdf5c6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e815e92c-4105-40fd-90e6-a17d35cdf5c6\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.261878 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frqxz\" (UniqueName: \"kubernetes.io/projected/0ab72afb-ab85-433a-9305-f157654c6755-kube-api-access-frqxz\") pod \"nova-metadata-0\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") " pod="openstack/nova-metadata-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.261958 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ghdd6"] Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.271606 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e815e92c-4105-40fd-90e6-a17d35cdf5c6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e815e92c-4105-40fd-90e6-a17d35cdf5c6\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.288253 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e815e92c-4105-40fd-90e6-a17d35cdf5c6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e815e92c-4105-40fd-90e6-a17d35cdf5c6\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.301439 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jt7l\" (UniqueName: \"kubernetes.io/projected/e815e92c-4105-40fd-90e6-a17d35cdf5c6-kube-api-access-4jt7l\") pod \"nova-cell1-novncproxy-0\" (UID: \"e815e92c-4105-40fd-90e6-a17d35cdf5c6\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.310134 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.318133 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.326967 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-bvvj2"] Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.345615 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.360566 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.360614 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.360640 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq9ll\" (UniqueName: \"kubernetes.io/projected/7f99a4a3-362d-4fbc-a960-4d1048895160-kube-api-access-gq9ll\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.360663 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-config\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.360715 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.360800 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-dns-svc\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.462292 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-dns-svc\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.462371 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.462397 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.462419 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq9ll\" (UniqueName: \"kubernetes.io/projected/7f99a4a3-362d-4fbc-a960-4d1048895160-kube-api-access-gq9ll\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.462463 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-config\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.462805 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.463901 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-dns-svc\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.463912 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.465856 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.466340 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-config\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.466816 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.496331 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq9ll\" (UniqueName: \"kubernetes.io/projected/7f99a4a3-362d-4fbc-a960-4d1048895160-kube-api-access-gq9ll\") pod \"dnsmasq-dns-757b4f8459-bvvj2\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.609457 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.678986 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:51:44 crc kubenswrapper[4553]: I0930 19:51:44.972675 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.091835 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83fbb883-5c68-4d9e-b446-5c4292bfd3d6","Type":"ContainerStarted","Data":"fe0098b7d8a0f7f8d4b4e356d590ae4552f39c190b0fda8af89d03a9231401fe"} Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.101175 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"10d8e670-bb79-46a5-885a-35deb8d0ab28","Type":"ContainerStarted","Data":"c40a4eb0d54b6f23b1dc70b4ae0ba5c1555cd27df47c7eb38a6031c2ae210e86"} Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.118728 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ghdd6" event={"ID":"321c9e7b-0cfd-440b-a1c9-664990e119c5","Type":"ContainerStarted","Data":"2b7e088975f22e9c523ab829ac918e8e87d330179b98e3ef7184a1d1c89d5256"} Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.118763 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ghdd6" event={"ID":"321c9e7b-0cfd-440b-a1c9-664990e119c5","Type":"ContainerStarted","Data":"9c37bf92d3d997a08a1965b8cafd3efba0c5ba484ef388e64996181ee186eec2"} Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.162121 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.176538 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.182666 4553 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-ghdd6" podStartSLOduration=3.182641102 podStartE2EDuration="3.182641102s" podCreationTimestamp="2025-09-30 19:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:51:45.14383589 +0000 UTC m=+1158.343338010" watchObservedRunningTime="2025-09-30 19:51:45.182641102 +0000 UTC m=+1158.382143232" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.291306 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-bvvj2"] Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.338508 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.379971 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xf5kh"] Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.381496 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.395182 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.395572 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.454117 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xf5kh"] Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.528159 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xf5kh\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.528442 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-config-data\") pod \"nova-cell1-conductor-db-sync-xf5kh\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.528473 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8q2x\" (UniqueName: \"kubernetes.io/projected/1722205d-27ba-4709-bca4-744114e7f16f-kube-api-access-g8q2x\") pod \"nova-cell1-conductor-db-sync-xf5kh\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.528490 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-scripts\") pod \"nova-cell1-conductor-db-sync-xf5kh\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.630475 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xf5kh\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.630519 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-config-data\") pod \"nova-cell1-conductor-db-sync-xf5kh\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.630547 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8q2x\" (UniqueName: \"kubernetes.io/projected/1722205d-27ba-4709-bca4-744114e7f16f-kube-api-access-g8q2x\") pod \"nova-cell1-conductor-db-sync-xf5kh\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.630567 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-scripts\") pod \"nova-cell1-conductor-db-sync-xf5kh\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.652839 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-scripts\") pod \"nova-cell1-conductor-db-sync-xf5kh\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.653417 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xf5kh\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.656737 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-config-data\") pod \"nova-cell1-conductor-db-sync-xf5kh\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.658121 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8q2x\" (UniqueName: \"kubernetes.io/projected/1722205d-27ba-4709-bca4-744114e7f16f-kube-api-access-g8q2x\") pod \"nova-cell1-conductor-db-sync-xf5kh\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:45 crc kubenswrapper[4553]: I0930 19:51:45.720987 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:46 crc kubenswrapper[4553]: I0930 19:51:46.147271 4553 generic.go:334] "Generic (PLEG): container finished" podID="7f99a4a3-362d-4fbc-a960-4d1048895160" containerID="6bf3b823f872851de9bac893d429526851a9bb42d8a2f3e36ede8a9e59368ddf" exitCode=0 Sep 30 19:51:46 crc kubenswrapper[4553]: I0930 19:51:46.147572 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" event={"ID":"7f99a4a3-362d-4fbc-a960-4d1048895160","Type":"ContainerDied","Data":"6bf3b823f872851de9bac893d429526851a9bb42d8a2f3e36ede8a9e59368ddf"} Sep 30 19:51:46 crc kubenswrapper[4553]: I0930 19:51:46.147598 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" event={"ID":"7f99a4a3-362d-4fbc-a960-4d1048895160","Type":"ContainerStarted","Data":"31716bff2ab1519fc8ac08d6cbf474f4ab7c04b2a757584e8f3f3098e96d6ca2"} Sep 30 19:51:46 crc kubenswrapper[4553]: I0930 19:51:46.157148 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ab72afb-ab85-433a-9305-f157654c6755","Type":"ContainerStarted","Data":"39e4a5dd247f1fd39c0c03e7bcfdcef931aa0252c15b21dab6778e09320b593e"} Sep 30 19:51:46 crc kubenswrapper[4553]: I0930 19:51:46.174257 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e815e92c-4105-40fd-90e6-a17d35cdf5c6","Type":"ContainerStarted","Data":"eb287f454b318c2dd22a1b4d2a259feed5b9b0ca8eef9a11fcbb589e4912d85e"} Sep 30 19:51:46 crc kubenswrapper[4553]: I0930 19:51:46.294158 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xf5kh"] Sep 30 19:51:47 crc kubenswrapper[4553]: I0930 19:51:47.200243 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" 
event={"ID":"7f99a4a3-362d-4fbc-a960-4d1048895160","Type":"ContainerStarted","Data":"ed549b92646be6141c1321c490975129afcf400a75ef1ce3571e532b9b0e3322"} Sep 30 19:51:47 crc kubenswrapper[4553]: I0930 19:51:47.201540 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:47 crc kubenswrapper[4553]: I0930 19:51:47.217809 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xf5kh" event={"ID":"1722205d-27ba-4709-bca4-744114e7f16f","Type":"ContainerStarted","Data":"fcfafa21a55fc9e7e9060e92f2c6b1daf85988e60cfcdc2eaec386173ead039a"} Sep 30 19:51:47 crc kubenswrapper[4553]: I0930 19:51:47.218115 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xf5kh" event={"ID":"1722205d-27ba-4709-bca4-744114e7f16f","Type":"ContainerStarted","Data":"0f9886fffaf5f6f33b57b4d926129a18d3a132d37ba5131394e7c87a126a7848"} Sep 30 19:51:47 crc kubenswrapper[4553]: I0930 19:51:47.251723 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" podStartSLOduration=3.251706053 podStartE2EDuration="3.251706053s" podCreationTimestamp="2025-09-30 19:51:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:51:47.234264596 +0000 UTC m=+1160.433766716" watchObservedRunningTime="2025-09-30 19:51:47.251706053 +0000 UTC m=+1160.451208183" Sep 30 19:51:47 crc kubenswrapper[4553]: I0930 19:51:47.280068 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xf5kh" podStartSLOduration=2.280032094 podStartE2EDuration="2.280032094s" podCreationTimestamp="2025-09-30 19:51:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 
19:51:47.250502932 +0000 UTC m=+1160.450005062" watchObservedRunningTime="2025-09-30 19:51:47.280032094 +0000 UTC m=+1160.479534224" Sep 30 19:51:47 crc kubenswrapper[4553]: I0930 19:51:47.917492 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 19:51:47 crc kubenswrapper[4553]: I0930 19:51:47.951955 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:51:51 crc kubenswrapper[4553]: I0930 19:51:51.259814 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"10d8e670-bb79-46a5-885a-35deb8d0ab28","Type":"ContainerStarted","Data":"e5109d2517a778088e682564f53811975696b0f2074f761d6c51c6f6f732e92b"} Sep 30 19:51:51 crc kubenswrapper[4553]: I0930 19:51:51.262857 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83fbb883-5c68-4d9e-b446-5c4292bfd3d6","Type":"ContainerStarted","Data":"c6dfbae57ed4136250db6cf42686ee3a6de70fe6bd915627ab00c75817774220"} Sep 30 19:51:51 crc kubenswrapper[4553]: I0930 19:51:51.262883 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83fbb883-5c68-4d9e-b446-5c4292bfd3d6","Type":"ContainerStarted","Data":"a012b724ec906199c087f28f2f9293005e9498fe1e6d976268e860299e5ef713"} Sep 30 19:51:51 crc kubenswrapper[4553]: I0930 19:51:51.265191 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0ab72afb-ab85-433a-9305-f157654c6755" containerName="nova-metadata-log" containerID="cri-o://9c8d7f6d01def9ad09a60511557c8db1fbf0d4ea1f9af0ed4d07576f4ca432c4" gracePeriod=30 Sep 30 19:51:51 crc kubenswrapper[4553]: I0930 19:51:51.265401 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ab72afb-ab85-433a-9305-f157654c6755","Type":"ContainerStarted","Data":"d78d1d1f459881839fa467b986a4dbaf103aec56942f9b32ece2210fd2581a7f"} Sep 
30 19:51:51 crc kubenswrapper[4553]: I0930 19:51:51.265422 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ab72afb-ab85-433a-9305-f157654c6755","Type":"ContainerStarted","Data":"9c8d7f6d01def9ad09a60511557c8db1fbf0d4ea1f9af0ed4d07576f4ca432c4"} Sep 30 19:51:51 crc kubenswrapper[4553]: I0930 19:51:51.265464 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0ab72afb-ab85-433a-9305-f157654c6755" containerName="nova-metadata-metadata" containerID="cri-o://d78d1d1f459881839fa467b986a4dbaf103aec56942f9b32ece2210fd2581a7f" gracePeriod=30 Sep 30 19:51:51 crc kubenswrapper[4553]: I0930 19:51:51.267172 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e815e92c-4105-40fd-90e6-a17d35cdf5c6","Type":"ContainerStarted","Data":"18b5c0222ed74fbf6f8a18c5ac204b7f68e6dee3cbf8f2012f9b821cebde9fd3"} Sep 30 19:51:51 crc kubenswrapper[4553]: I0930 19:51:51.267247 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e815e92c-4105-40fd-90e6-a17d35cdf5c6" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://18b5c0222ed74fbf6f8a18c5ac204b7f68e6dee3cbf8f2012f9b821cebde9fd3" gracePeriod=30 Sep 30 19:51:51 crc kubenswrapper[4553]: I0930 19:51:51.282204 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.387069016 podStartE2EDuration="8.282189139s" podCreationTimestamp="2025-09-30 19:51:43 +0000 UTC" firstStartedPulling="2025-09-30 19:51:44.721502827 +0000 UTC m=+1157.921004957" lastFinishedPulling="2025-09-30 19:51:50.61662296 +0000 UTC m=+1163.816125080" observedRunningTime="2025-09-30 19:51:51.278012157 +0000 UTC m=+1164.477514287" watchObservedRunningTime="2025-09-30 19:51:51.282189139 +0000 UTC m=+1164.481691269" Sep 30 19:51:51 crc kubenswrapper[4553]: I0930 
19:51:51.320020 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.851617874 podStartE2EDuration="8.320000784s" podCreationTimestamp="2025-09-30 19:51:43 +0000 UTC" firstStartedPulling="2025-09-30 19:51:45.154695032 +0000 UTC m=+1158.354197172" lastFinishedPulling="2025-09-30 19:51:50.623077952 +0000 UTC m=+1163.822580082" observedRunningTime="2025-09-30 19:51:51.307192621 +0000 UTC m=+1164.506694751" watchObservedRunningTime="2025-09-30 19:51:51.320000784 +0000 UTC m=+1164.519502914" Sep 30 19:51:51 crc kubenswrapper[4553]: I0930 19:51:51.340583 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.717768102 podStartE2EDuration="8.340565266s" podCreationTimestamp="2025-09-30 19:51:43 +0000 UTC" firstStartedPulling="2025-09-30 19:51:44.998820029 +0000 UTC m=+1158.198322159" lastFinishedPulling="2025-09-30 19:51:50.621617193 +0000 UTC m=+1163.821119323" observedRunningTime="2025-09-30 19:51:51.331452561 +0000 UTC m=+1164.530954691" watchObservedRunningTime="2025-09-30 19:51:51.340565266 +0000 UTC m=+1164.540067396" Sep 30 19:51:51 crc kubenswrapper[4553]: I0930 19:51:51.356546 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.930076288 podStartE2EDuration="8.356526774s" podCreationTimestamp="2025-09-30 19:51:43 +0000 UTC" firstStartedPulling="2025-09-30 19:51:45.194239363 +0000 UTC m=+1158.393741493" lastFinishedPulling="2025-09-30 19:51:50.620689849 +0000 UTC m=+1163.820191979" observedRunningTime="2025-09-30 19:51:51.351802368 +0000 UTC m=+1164.551304498" watchObservedRunningTime="2025-09-30 19:51:51.356526774 +0000 UTC m=+1164.556028904" Sep 30 19:51:52 crc kubenswrapper[4553]: I0930 19:51:52.278127 4553 generic.go:334] "Generic (PLEG): container finished" podID="0ab72afb-ab85-433a-9305-f157654c6755" 
containerID="9c8d7f6d01def9ad09a60511557c8db1fbf0d4ea1f9af0ed4d07576f4ca432c4" exitCode=143 Sep 30 19:51:52 crc kubenswrapper[4553]: I0930 19:51:52.278173 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ab72afb-ab85-433a-9305-f157654c6755","Type":"ContainerDied","Data":"9c8d7f6d01def9ad09a60511557c8db1fbf0d4ea1f9af0ed4d07576f4ca432c4"} Sep 30 19:51:53 crc kubenswrapper[4553]: I0930 19:51:53.743450 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 19:51:53 crc kubenswrapper[4553]: I0930 19:51:53.743854 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 19:51:53 crc kubenswrapper[4553]: I0930 19:51:53.773660 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 19:51:54 crc kubenswrapper[4553]: I0930 19:51:54.310823 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 19:51:54 crc kubenswrapper[4553]: I0930 19:51:54.310878 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 19:51:54 crc kubenswrapper[4553]: I0930 19:51:54.319188 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 19:51:54 crc kubenswrapper[4553]: I0930 19:51:54.319238 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 19:51:54 crc kubenswrapper[4553]: I0930 19:51:54.326836 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 19:51:54 crc kubenswrapper[4553]: I0930 19:51:54.350196 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:51:54 crc kubenswrapper[4553]: I0930 19:51:54.612191 4553 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:51:54 crc kubenswrapper[4553]: I0930 19:51:54.667514 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-fk7nz"] Sep 30 19:51:54 crc kubenswrapper[4553]: I0930 19:51:54.667751 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" podUID="79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" containerName="dnsmasq-dns" containerID="cri-o://daa54f87540d5848f02895dca26447bec227591679bc085a6a421daed3ae1f98" gracePeriod=10 Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.287785 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.311503 4553 generic.go:334] "Generic (PLEG): container finished" podID="79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" containerID="daa54f87540d5848f02895dca26447bec227591679bc085a6a421daed3ae1f98" exitCode=0 Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.311795 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.312106 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" event={"ID":"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2","Type":"ContainerDied","Data":"daa54f87540d5848f02895dca26447bec227591679bc085a6a421daed3ae1f98"} Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.312176 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-fk7nz" event={"ID":"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2","Type":"ContainerDied","Data":"94d848d60f1db24cc760317d8999e82eaf2a8a0f6f7adc47bf0a020216b77214"} Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.312197 4553 scope.go:117] "RemoveContainer" containerID="daa54f87540d5848f02895dca26447bec227591679bc085a6a421daed3ae1f98" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.369781 4553 scope.go:117] "RemoveContainer" containerID="b25f38208409c6530e26f5ca28f0cc22d73a56943e6f4c96987700870d0de220" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.410853 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb84d\" (UniqueName: \"kubernetes.io/projected/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-kube-api-access-pb84d\") pod \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.410929 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-dns-swift-storage-0\") pod \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.411002 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-ovsdbserver-sb\") pod \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.411090 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-config\") pod \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.411127 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-dns-svc\") pod \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.411170 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-ovsdbserver-nb\") pod \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\" (UID: \"79a2a72b-5ab3-4251-86d3-cdd8966dd5a2\") " Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.417772 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="83fbb883-5c68-4d9e-b446-5c4292bfd3d6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.418148 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="83fbb883-5c68-4d9e-b446-5c4292bfd3d6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.453759 4553 
scope.go:117] "RemoveContainer" containerID="daa54f87540d5848f02895dca26447bec227591679bc085a6a421daed3ae1f98" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.470215 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-kube-api-access-pb84d" (OuterVolumeSpecName: "kube-api-access-pb84d") pod "79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" (UID: "79a2a72b-5ab3-4251-86d3-cdd8966dd5a2"). InnerVolumeSpecName "kube-api-access-pb84d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:51:55 crc kubenswrapper[4553]: E0930 19:51:55.470340 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa54f87540d5848f02895dca26447bec227591679bc085a6a421daed3ae1f98\": container with ID starting with daa54f87540d5848f02895dca26447bec227591679bc085a6a421daed3ae1f98 not found: ID does not exist" containerID="daa54f87540d5848f02895dca26447bec227591679bc085a6a421daed3ae1f98" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.470372 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa54f87540d5848f02895dca26447bec227591679bc085a6a421daed3ae1f98"} err="failed to get container status \"daa54f87540d5848f02895dca26447bec227591679bc085a6a421daed3ae1f98\": rpc error: code = NotFound desc = could not find container \"daa54f87540d5848f02895dca26447bec227591679bc085a6a421daed3ae1f98\": container with ID starting with daa54f87540d5848f02895dca26447bec227591679bc085a6a421daed3ae1f98 not found: ID does not exist" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.470395 4553 scope.go:117] "RemoveContainer" containerID="b25f38208409c6530e26f5ca28f0cc22d73a56943e6f4c96987700870d0de220" Sep 30 19:51:55 crc kubenswrapper[4553]: E0930 19:51:55.470939 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b25f38208409c6530e26f5ca28f0cc22d73a56943e6f4c96987700870d0de220\": container with ID starting with b25f38208409c6530e26f5ca28f0cc22d73a56943e6f4c96987700870d0de220 not found: ID does not exist" containerID="b25f38208409c6530e26f5ca28f0cc22d73a56943e6f4c96987700870d0de220" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.470958 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b25f38208409c6530e26f5ca28f0cc22d73a56943e6f4c96987700870d0de220"} err="failed to get container status \"b25f38208409c6530e26f5ca28f0cc22d73a56943e6f4c96987700870d0de220\": rpc error: code = NotFound desc = could not find container \"b25f38208409c6530e26f5ca28f0cc22d73a56943e6f4c96987700870d0de220\": container with ID starting with b25f38208409c6530e26f5ca28f0cc22d73a56943e6f4c96987700870d0de220 not found: ID does not exist" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.514255 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb84d\" (UniqueName: \"kubernetes.io/projected/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-kube-api-access-pb84d\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.543510 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" (UID: "79a2a72b-5ab3-4251-86d3-cdd8966dd5a2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.600294 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-config" (OuterVolumeSpecName: "config") pod "79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" (UID: "79a2a72b-5ab3-4251-86d3-cdd8966dd5a2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.601369 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" (UID: "79a2a72b-5ab3-4251-86d3-cdd8966dd5a2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.624915 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.625079 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.625138 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.631690 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" (UID: "79a2a72b-5ab3-4251-86d3-cdd8966dd5a2"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.631758 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" (UID: "79a2a72b-5ab3-4251-86d3-cdd8966dd5a2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.727362 4553 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.727387 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.940572 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-fk7nz"] Sep 30 19:51:55 crc kubenswrapper[4553]: I0930 19:51:55.952463 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-fk7nz"] Sep 30 19:51:57 crc kubenswrapper[4553]: I0930 19:51:57.329412 4553 generic.go:334] "Generic (PLEG): container finished" podID="321c9e7b-0cfd-440b-a1c9-664990e119c5" containerID="2b7e088975f22e9c523ab829ac918e8e87d330179b98e3ef7184a1d1c89d5256" exitCode=0 Sep 30 19:51:57 crc kubenswrapper[4553]: I0930 19:51:57.329679 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ghdd6" event={"ID":"321c9e7b-0cfd-440b-a1c9-664990e119c5","Type":"ContainerDied","Data":"2b7e088975f22e9c523ab829ac918e8e87d330179b98e3ef7184a1d1c89d5256"} Sep 30 19:51:57 crc kubenswrapper[4553]: I0930 
19:51:57.544466 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" path="/var/lib/kubelet/pods/79a2a72b-5ab3-4251-86d3-cdd8966dd5a2/volumes" Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.339434 4553 generic.go:334] "Generic (PLEG): container finished" podID="1722205d-27ba-4709-bca4-744114e7f16f" containerID="fcfafa21a55fc9e7e9060e92f2c6b1daf85988e60cfcdc2eaec386173ead039a" exitCode=0 Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.339534 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xf5kh" event={"ID":"1722205d-27ba-4709-bca4-744114e7f16f","Type":"ContainerDied","Data":"fcfafa21a55fc9e7e9060e92f2c6b1daf85988e60cfcdc2eaec386173ead039a"} Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.702789 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.787558 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hkw7\" (UniqueName: \"kubernetes.io/projected/321c9e7b-0cfd-440b-a1c9-664990e119c5-kube-api-access-4hkw7\") pod \"321c9e7b-0cfd-440b-a1c9-664990e119c5\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.787600 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-combined-ca-bundle\") pod \"321c9e7b-0cfd-440b-a1c9-664990e119c5\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.787690 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-scripts\") pod \"321c9e7b-0cfd-440b-a1c9-664990e119c5\" (UID: 
\"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.787747 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-config-data\") pod \"321c9e7b-0cfd-440b-a1c9-664990e119c5\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.808317 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-scripts" (OuterVolumeSpecName: "scripts") pod "321c9e7b-0cfd-440b-a1c9-664990e119c5" (UID: "321c9e7b-0cfd-440b-a1c9-664990e119c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.818200 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/321c9e7b-0cfd-440b-a1c9-664990e119c5-kube-api-access-4hkw7" (OuterVolumeSpecName: "kube-api-access-4hkw7") pod "321c9e7b-0cfd-440b-a1c9-664990e119c5" (UID: "321c9e7b-0cfd-440b-a1c9-664990e119c5"). InnerVolumeSpecName "kube-api-access-4hkw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.882321 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-config-data" (OuterVolumeSpecName: "config-data") pod "321c9e7b-0cfd-440b-a1c9-664990e119c5" (UID: "321c9e7b-0cfd-440b-a1c9-664990e119c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.889212 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "321c9e7b-0cfd-440b-a1c9-664990e119c5" (UID: "321c9e7b-0cfd-440b-a1c9-664990e119c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.889786 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-combined-ca-bundle\") pod \"321c9e7b-0cfd-440b-a1c9-664990e119c5\" (UID: \"321c9e7b-0cfd-440b-a1c9-664990e119c5\") " Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.890333 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.890349 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.890358 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hkw7\" (UniqueName: \"kubernetes.io/projected/321c9e7b-0cfd-440b-a1c9-664990e119c5-kube-api-access-4hkw7\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:58 crc kubenswrapper[4553]: W0930 19:51:58.890435 4553 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/321c9e7b-0cfd-440b-a1c9-664990e119c5/volumes/kubernetes.io~secret/combined-ca-bundle Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.890446 4553 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "321c9e7b-0cfd-440b-a1c9-664990e119c5" (UID: "321c9e7b-0cfd-440b-a1c9-664990e119c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:51:58 crc kubenswrapper[4553]: I0930 19:51:58.991671 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321c9e7b-0cfd-440b-a1c9-664990e119c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.355185 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ghdd6" event={"ID":"321c9e7b-0cfd-440b-a1c9-664990e119c5","Type":"ContainerDied","Data":"9c37bf92d3d997a08a1965b8cafd3efba0c5ba484ef388e64996181ee186eec2"} Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.355215 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ghdd6" Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.355229 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c37bf92d3d997a08a1965b8cafd3efba0c5ba484ef388e64996181ee186eec2" Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.554546 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.554845 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="83fbb883-5c68-4d9e-b446-5c4292bfd3d6" containerName="nova-api-log" containerID="cri-o://a012b724ec906199c087f28f2f9293005e9498fe1e6d976268e860299e5ef713" gracePeriod=30 Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.555324 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="83fbb883-5c68-4d9e-b446-5c4292bfd3d6" containerName="nova-api-api" containerID="cri-o://c6dfbae57ed4136250db6cf42686ee3a6de70fe6bd915627ab00c75817774220" gracePeriod=30 Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.572443 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.572639 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="10d8e670-bb79-46a5-885a-35deb8d0ab28" containerName="nova-scheduler-scheduler" containerID="cri-o://e5109d2517a778088e682564f53811975696b0f2074f761d6c51c6f6f732e92b" gracePeriod=30 Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.585346 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 
19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.585383 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.585433 4553 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.586084 4553 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a59ed9a27838f8357f3a7a080d587703e9b1aa4272b3bbad7477f76d8c23eba2"} pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.586126 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" containerID="cri-o://a59ed9a27838f8357f3a7a080d587703e9b1aa4272b3bbad7477f76d8c23eba2" gracePeriod=600 Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.836515 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.906125 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-scripts\") pod \"1722205d-27ba-4709-bca4-744114e7f16f\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.906222 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-config-data\") pod \"1722205d-27ba-4709-bca4-744114e7f16f\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.906326 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-combined-ca-bundle\") pod \"1722205d-27ba-4709-bca4-744114e7f16f\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.906364 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8q2x\" (UniqueName: \"kubernetes.io/projected/1722205d-27ba-4709-bca4-744114e7f16f-kube-api-access-g8q2x\") pod \"1722205d-27ba-4709-bca4-744114e7f16f\" (UID: \"1722205d-27ba-4709-bca4-744114e7f16f\") " Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.912953 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1722205d-27ba-4709-bca4-744114e7f16f-kube-api-access-g8q2x" (OuterVolumeSpecName: "kube-api-access-g8q2x") pod "1722205d-27ba-4709-bca4-744114e7f16f" (UID: "1722205d-27ba-4709-bca4-744114e7f16f"). InnerVolumeSpecName "kube-api-access-g8q2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.928122 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-scripts" (OuterVolumeSpecName: "scripts") pod "1722205d-27ba-4709-bca4-744114e7f16f" (UID: "1722205d-27ba-4709-bca4-744114e7f16f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.943408 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1722205d-27ba-4709-bca4-744114e7f16f" (UID: "1722205d-27ba-4709-bca4-744114e7f16f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:51:59 crc kubenswrapper[4553]: I0930 19:51:59.953971 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-config-data" (OuterVolumeSpecName: "config-data") pod "1722205d-27ba-4709-bca4-744114e7f16f" (UID: "1722205d-27ba-4709-bca4-744114e7f16f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.008238 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.008285 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.008296 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1722205d-27ba-4709-bca4-744114e7f16f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.008307 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8q2x\" (UniqueName: \"kubernetes.io/projected/1722205d-27ba-4709-bca4-744114e7f16f-kube-api-access-g8q2x\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.365426 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xf5kh" event={"ID":"1722205d-27ba-4709-bca4-744114e7f16f","Type":"ContainerDied","Data":"0f9886fffaf5f6f33b57b4d926129a18d3a132d37ba5131394e7c87a126a7848"} Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.366653 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f9886fffaf5f6f33b57b4d926129a18d3a132d37ba5131394e7c87a126a7848" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.366541 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xf5kh" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.370456 4553 generic.go:334] "Generic (PLEG): container finished" podID="83fbb883-5c68-4d9e-b446-5c4292bfd3d6" containerID="a012b724ec906199c087f28f2f9293005e9498fe1e6d976268e860299e5ef713" exitCode=143 Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.371338 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83fbb883-5c68-4d9e-b446-5c4292bfd3d6","Type":"ContainerDied","Data":"a012b724ec906199c087f28f2f9293005e9498fe1e6d976268e860299e5ef713"} Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.387230 4553 generic.go:334] "Generic (PLEG): container finished" podID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerID="a59ed9a27838f8357f3a7a080d587703e9b1aa4272b3bbad7477f76d8c23eba2" exitCode=0 Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.387280 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerDied","Data":"a59ed9a27838f8357f3a7a080d587703e9b1aa4272b3bbad7477f76d8c23eba2"} Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.387308 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerStarted","Data":"c7864ee52b427b57981d569d4ee7a9292f56eb6909fb29d851a6775585474b37"} Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.387335 4553 scope.go:117] "RemoveContainer" containerID="6c53001a48c79a1addca634bfcf9ef4be43fc5d44c498f0ba986c32047fcaed3" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.466123 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 19:52:00 crc kubenswrapper[4553]: E0930 19:52:00.466608 4553 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" containerName="init" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.466631 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" containerName="init" Sep 30 19:52:00 crc kubenswrapper[4553]: E0930 19:52:00.466654 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="321c9e7b-0cfd-440b-a1c9-664990e119c5" containerName="nova-manage" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.466664 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="321c9e7b-0cfd-440b-a1c9-664990e119c5" containerName="nova-manage" Sep 30 19:52:00 crc kubenswrapper[4553]: E0930 19:52:00.466678 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1722205d-27ba-4709-bca4-744114e7f16f" containerName="nova-cell1-conductor-db-sync" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.466687 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="1722205d-27ba-4709-bca4-744114e7f16f" containerName="nova-cell1-conductor-db-sync" Sep 30 19:52:00 crc kubenswrapper[4553]: E0930 19:52:00.466704 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" containerName="dnsmasq-dns" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.466711 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" containerName="dnsmasq-dns" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.466924 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="1722205d-27ba-4709-bca4-744114e7f16f" containerName="nova-cell1-conductor-db-sync" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.466954 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a2a72b-5ab3-4251-86d3-cdd8966dd5a2" containerName="dnsmasq-dns" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.466969 4553 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="321c9e7b-0cfd-440b-a1c9-664990e119c5" containerName="nova-manage" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.468892 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.471270 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.482690 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.618992 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a2252d-7bc1-4a07-ae99-6fbfa13df27f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"12a2252d-7bc1-4a07-ae99-6fbfa13df27f\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.619118 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47ql4\" (UniqueName: \"kubernetes.io/projected/12a2252d-7bc1-4a07-ae99-6fbfa13df27f-kube-api-access-47ql4\") pod \"nova-cell1-conductor-0\" (UID: \"12a2252d-7bc1-4a07-ae99-6fbfa13df27f\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.619208 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a2252d-7bc1-4a07-ae99-6fbfa13df27f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"12a2252d-7bc1-4a07-ae99-6fbfa13df27f\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.721507 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47ql4\" (UniqueName: 
\"kubernetes.io/projected/12a2252d-7bc1-4a07-ae99-6fbfa13df27f-kube-api-access-47ql4\") pod \"nova-cell1-conductor-0\" (UID: \"12a2252d-7bc1-4a07-ae99-6fbfa13df27f\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.721663 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a2252d-7bc1-4a07-ae99-6fbfa13df27f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"12a2252d-7bc1-4a07-ae99-6fbfa13df27f\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.721813 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a2252d-7bc1-4a07-ae99-6fbfa13df27f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"12a2252d-7bc1-4a07-ae99-6fbfa13df27f\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.728964 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a2252d-7bc1-4a07-ae99-6fbfa13df27f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"12a2252d-7bc1-4a07-ae99-6fbfa13df27f\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.729009 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a2252d-7bc1-4a07-ae99-6fbfa13df27f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"12a2252d-7bc1-4a07-ae99-6fbfa13df27f\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.739715 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47ql4\" (UniqueName: \"kubernetes.io/projected/12a2252d-7bc1-4a07-ae99-6fbfa13df27f-kube-api-access-47ql4\") pod \"nova-cell1-conductor-0\" (UID: 
\"12a2252d-7bc1-4a07-ae99-6fbfa13df27f\") " pod="openstack/nova-cell1-conductor-0" Sep 30 19:52:00 crc kubenswrapper[4553]: I0930 19:52:00.807617 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 19:52:01 crc kubenswrapper[4553]: I0930 19:52:01.270721 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 19:52:01 crc kubenswrapper[4553]: I0930 19:52:01.395750 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"12a2252d-7bc1-4a07-ae99-6fbfa13df27f","Type":"ContainerStarted","Data":"49de4a4c8184832dffa4754c2215c879bf10ac12c95194602a1c4a0cc043bd75"} Sep 30 19:52:02 crc kubenswrapper[4553]: I0930 19:52:02.407550 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"12a2252d-7bc1-4a07-ae99-6fbfa13df27f","Type":"ContainerStarted","Data":"05622c6e72b16782ac3c0a329e683246dba3dfb23250aefc9ef76f6195e70979"} Sep 30 19:52:02 crc kubenswrapper[4553]: I0930 19:52:02.407980 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 30 19:52:02 crc kubenswrapper[4553]: I0930 19:52:02.423906 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.4238889 podStartE2EDuration="2.4238889s" podCreationTimestamp="2025-09-30 19:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:52:02.422804341 +0000 UTC m=+1175.622306471" watchObservedRunningTime="2025-09-30 19:52:02.4238889 +0000 UTC m=+1175.623391030" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.295065 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.302344 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.415777 4553 generic.go:334] "Generic (PLEG): container finished" podID="10d8e670-bb79-46a5-885a-35deb8d0ab28" containerID="e5109d2517a778088e682564f53811975696b0f2074f761d6c51c6f6f732e92b" exitCode=0 Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.415859 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.416427 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"10d8e670-bb79-46a5-885a-35deb8d0ab28","Type":"ContainerDied","Data":"e5109d2517a778088e682564f53811975696b0f2074f761d6c51c6f6f732e92b"} Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.416450 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"10d8e670-bb79-46a5-885a-35deb8d0ab28","Type":"ContainerDied","Data":"c40a4eb0d54b6f23b1dc70b4ae0ba5c1555cd27df47c7eb38a6031c2ae210e86"} Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.416464 4553 scope.go:117] "RemoveContainer" containerID="e5109d2517a778088e682564f53811975696b0f2074f761d6c51c6f6f732e92b" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.418801 4553 generic.go:334] "Generic (PLEG): container finished" podID="83fbb883-5c68-4d9e-b446-5c4292bfd3d6" containerID="c6dfbae57ed4136250db6cf42686ee3a6de70fe6bd915627ab00c75817774220" exitCode=0 Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.419617 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.419790 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83fbb883-5c68-4d9e-b446-5c4292bfd3d6","Type":"ContainerDied","Data":"c6dfbae57ed4136250db6cf42686ee3a6de70fe6bd915627ab00c75817774220"} Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.419812 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83fbb883-5c68-4d9e-b446-5c4292bfd3d6","Type":"ContainerDied","Data":"fe0098b7d8a0f7f8d4b4e356d590ae4552f39c190b0fda8af89d03a9231401fe"} Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.435990 4553 scope.go:117] "RemoveContainer" containerID="e5109d2517a778088e682564f53811975696b0f2074f761d6c51c6f6f732e92b" Sep 30 19:52:03 crc kubenswrapper[4553]: E0930 19:52:03.436400 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5109d2517a778088e682564f53811975696b0f2074f761d6c51c6f6f732e92b\": container with ID starting with e5109d2517a778088e682564f53811975696b0f2074f761d6c51c6f6f732e92b not found: ID does not exist" containerID="e5109d2517a778088e682564f53811975696b0f2074f761d6c51c6f6f732e92b" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.436428 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5109d2517a778088e682564f53811975696b0f2074f761d6c51c6f6f732e92b"} err="failed to get container status \"e5109d2517a778088e682564f53811975696b0f2074f761d6c51c6f6f732e92b\": rpc error: code = NotFound desc = could not find container \"e5109d2517a778088e682564f53811975696b0f2074f761d6c51c6f6f732e92b\": container with ID starting with e5109d2517a778088e682564f53811975696b0f2074f761d6c51c6f6f732e92b not found: ID does not exist" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.436445 4553 scope.go:117] "RemoveContainer" 
containerID="c6dfbae57ed4136250db6cf42686ee3a6de70fe6bd915627ab00c75817774220" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.454529 4553 scope.go:117] "RemoveContainer" containerID="a012b724ec906199c087f28f2f9293005e9498fe1e6d976268e860299e5ef713" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.472581 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkstj\" (UniqueName: \"kubernetes.io/projected/10d8e670-bb79-46a5-885a-35deb8d0ab28-kube-api-access-xkstj\") pod \"10d8e670-bb79-46a5-885a-35deb8d0ab28\" (UID: \"10d8e670-bb79-46a5-885a-35deb8d0ab28\") " Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.472685 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-config-data\") pod \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.472703 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-combined-ca-bundle\") pod \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.472727 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d8e670-bb79-46a5-885a-35deb8d0ab28-config-data\") pod \"10d8e670-bb79-46a5-885a-35deb8d0ab28\" (UID: \"10d8e670-bb79-46a5-885a-35deb8d0ab28\") " Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.472768 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bmll\" (UniqueName: \"kubernetes.io/projected/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-kube-api-access-9bmll\") pod \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\" (UID: 
\"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.472876 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d8e670-bb79-46a5-885a-35deb8d0ab28-combined-ca-bundle\") pod \"10d8e670-bb79-46a5-885a-35deb8d0ab28\" (UID: \"10d8e670-bb79-46a5-885a-35deb8d0ab28\") " Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.472902 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-logs\") pod \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\" (UID: \"83fbb883-5c68-4d9e-b446-5c4292bfd3d6\") " Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.473688 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-logs" (OuterVolumeSpecName: "logs") pod "83fbb883-5c68-4d9e-b446-5c4292bfd3d6" (UID: "83fbb883-5c68-4d9e-b446-5c4292bfd3d6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.474505 4553 scope.go:117] "RemoveContainer" containerID="c6dfbae57ed4136250db6cf42686ee3a6de70fe6bd915627ab00c75817774220" Sep 30 19:52:03 crc kubenswrapper[4553]: E0930 19:52:03.474936 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6dfbae57ed4136250db6cf42686ee3a6de70fe6bd915627ab00c75817774220\": container with ID starting with c6dfbae57ed4136250db6cf42686ee3a6de70fe6bd915627ab00c75817774220 not found: ID does not exist" containerID="c6dfbae57ed4136250db6cf42686ee3a6de70fe6bd915627ab00c75817774220" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.474976 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6dfbae57ed4136250db6cf42686ee3a6de70fe6bd915627ab00c75817774220"} err="failed to get container status \"c6dfbae57ed4136250db6cf42686ee3a6de70fe6bd915627ab00c75817774220\": rpc error: code = NotFound desc = could not find container \"c6dfbae57ed4136250db6cf42686ee3a6de70fe6bd915627ab00c75817774220\": container with ID starting with c6dfbae57ed4136250db6cf42686ee3a6de70fe6bd915627ab00c75817774220 not found: ID does not exist" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.474997 4553 scope.go:117] "RemoveContainer" containerID="a012b724ec906199c087f28f2f9293005e9498fe1e6d976268e860299e5ef713" Sep 30 19:52:03 crc kubenswrapper[4553]: E0930 19:52:03.475283 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a012b724ec906199c087f28f2f9293005e9498fe1e6d976268e860299e5ef713\": container with ID starting with a012b724ec906199c087f28f2f9293005e9498fe1e6d976268e860299e5ef713 not found: ID does not exist" containerID="a012b724ec906199c087f28f2f9293005e9498fe1e6d976268e860299e5ef713" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.475325 
4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a012b724ec906199c087f28f2f9293005e9498fe1e6d976268e860299e5ef713"} err="failed to get container status \"a012b724ec906199c087f28f2f9293005e9498fe1e6d976268e860299e5ef713\": rpc error: code = NotFound desc = could not find container \"a012b724ec906199c087f28f2f9293005e9498fe1e6d976268e860299e5ef713\": container with ID starting with a012b724ec906199c087f28f2f9293005e9498fe1e6d976268e860299e5ef713 not found: ID does not exist" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.482314 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-kube-api-access-9bmll" (OuterVolumeSpecName: "kube-api-access-9bmll") pod "83fbb883-5c68-4d9e-b446-5c4292bfd3d6" (UID: "83fbb883-5c68-4d9e-b446-5c4292bfd3d6"). InnerVolumeSpecName "kube-api-access-9bmll". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.482430 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d8e670-bb79-46a5-885a-35deb8d0ab28-kube-api-access-xkstj" (OuterVolumeSpecName: "kube-api-access-xkstj") pod "10d8e670-bb79-46a5-885a-35deb8d0ab28" (UID: "10d8e670-bb79-46a5-885a-35deb8d0ab28"). InnerVolumeSpecName "kube-api-access-xkstj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.506075 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d8e670-bb79-46a5-885a-35deb8d0ab28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10d8e670-bb79-46a5-885a-35deb8d0ab28" (UID: "10d8e670-bb79-46a5-885a-35deb8d0ab28"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.506166 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-config-data" (OuterVolumeSpecName: "config-data") pod "83fbb883-5c68-4d9e-b446-5c4292bfd3d6" (UID: "83fbb883-5c68-4d9e-b446-5c4292bfd3d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.511442 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d8e670-bb79-46a5-885a-35deb8d0ab28-config-data" (OuterVolumeSpecName: "config-data") pod "10d8e670-bb79-46a5-885a-35deb8d0ab28" (UID: "10d8e670-bb79-46a5-885a-35deb8d0ab28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.524507 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83fbb883-5c68-4d9e-b446-5c4292bfd3d6" (UID: "83fbb883-5c68-4d9e-b446-5c4292bfd3d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.575414 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkstj\" (UniqueName: \"kubernetes.io/projected/10d8e670-bb79-46a5-885a-35deb8d0ab28-kube-api-access-xkstj\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.575467 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.575477 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.575486 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d8e670-bb79-46a5-885a-35deb8d0ab28-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.575494 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bmll\" (UniqueName: \"kubernetes.io/projected/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-kube-api-access-9bmll\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.575506 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d8e670-bb79-46a5-885a-35deb8d0ab28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.575517 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83fbb883-5c68-4d9e-b446-5c4292bfd3d6-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.736289 4553 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.760240 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.772442 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:52:03 crc kubenswrapper[4553]: E0930 19:52:03.773092 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83fbb883-5c68-4d9e-b446-5c4292bfd3d6" containerName="nova-api-api" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.773116 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="83fbb883-5c68-4d9e-b446-5c4292bfd3d6" containerName="nova-api-api" Sep 30 19:52:03 crc kubenswrapper[4553]: E0930 19:52:03.773161 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d8e670-bb79-46a5-885a-35deb8d0ab28" containerName="nova-scheduler-scheduler" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.773168 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d8e670-bb79-46a5-885a-35deb8d0ab28" containerName="nova-scheduler-scheduler" Sep 30 19:52:03 crc kubenswrapper[4553]: E0930 19:52:03.773177 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83fbb883-5c68-4d9e-b446-5c4292bfd3d6" containerName="nova-api-log" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.773184 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="83fbb883-5c68-4d9e-b446-5c4292bfd3d6" containerName="nova-api-log" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.773525 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d8e670-bb79-46a5-885a-35deb8d0ab28" containerName="nova-scheduler-scheduler" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.773552 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="83fbb883-5c68-4d9e-b446-5c4292bfd3d6" containerName="nova-api-api" Sep 
30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.773573 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="83fbb883-5c68-4d9e-b446-5c4292bfd3d6" containerName="nova-api-log" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.774457 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.783152 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.797085 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.823881 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.842413 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.866102 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.867691 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.869845 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.871348 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.888577 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9385676-e090-4851-b31b-ccbc62073e7f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b9385676-e090-4851-b31b-ccbc62073e7f\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.888620 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9385676-e090-4851-b31b-ccbc62073e7f-config-data\") pod \"nova-scheduler-0\" (UID: \"b9385676-e090-4851-b31b-ccbc62073e7f\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.888678 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvt4q\" (UniqueName: \"kubernetes.io/projected/b9385676-e090-4851-b31b-ccbc62073e7f-kube-api-access-fvt4q\") pod \"nova-scheduler-0\" (UID: \"b9385676-e090-4851-b31b-ccbc62073e7f\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.990332 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9385676-e090-4851-b31b-ccbc62073e7f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b9385676-e090-4851-b31b-ccbc62073e7f\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.990399 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e3cc5f-ee64-4139-87b0-e709083511be-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " pod="openstack/nova-api-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.990423 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9385676-e090-4851-b31b-ccbc62073e7f-config-data\") pod \"nova-scheduler-0\" (UID: \"b9385676-e090-4851-b31b-ccbc62073e7f\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.990484 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e3cc5f-ee64-4139-87b0-e709083511be-config-data\") pod \"nova-api-0\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " pod="openstack/nova-api-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.990508 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvt4q\" (UniqueName: \"kubernetes.io/projected/b9385676-e090-4851-b31b-ccbc62073e7f-kube-api-access-fvt4q\") pod \"nova-scheduler-0\" (UID: \"b9385676-e090-4851-b31b-ccbc62073e7f\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.990539 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5sxc\" (UniqueName: \"kubernetes.io/projected/c7e3cc5f-ee64-4139-87b0-e709083511be-kube-api-access-p5sxc\") pod \"nova-api-0\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " pod="openstack/nova-api-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.990586 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c7e3cc5f-ee64-4139-87b0-e709083511be-logs\") pod \"nova-api-0\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " pod="openstack/nova-api-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.998751 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9385676-e090-4851-b31b-ccbc62073e7f-config-data\") pod \"nova-scheduler-0\" (UID: \"b9385676-e090-4851-b31b-ccbc62073e7f\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:03 crc kubenswrapper[4553]: I0930 19:52:03.999386 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9385676-e090-4851-b31b-ccbc62073e7f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b9385676-e090-4851-b31b-ccbc62073e7f\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:04 crc kubenswrapper[4553]: I0930 19:52:04.011772 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvt4q\" (UniqueName: \"kubernetes.io/projected/b9385676-e090-4851-b31b-ccbc62073e7f-kube-api-access-fvt4q\") pod \"nova-scheduler-0\" (UID: \"b9385676-e090-4851-b31b-ccbc62073e7f\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:04 crc kubenswrapper[4553]: I0930 19:52:04.092395 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e3cc5f-ee64-4139-87b0-e709083511be-logs\") pod \"nova-api-0\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " pod="openstack/nova-api-0" Sep 30 19:52:04 crc kubenswrapper[4553]: I0930 19:52:04.092486 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e3cc5f-ee64-4139-87b0-e709083511be-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " pod="openstack/nova-api-0" Sep 30 19:52:04 crc kubenswrapper[4553]: I0930 
19:52:04.092538 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e3cc5f-ee64-4139-87b0-e709083511be-config-data\") pod \"nova-api-0\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " pod="openstack/nova-api-0" Sep 30 19:52:04 crc kubenswrapper[4553]: I0930 19:52:04.092576 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5sxc\" (UniqueName: \"kubernetes.io/projected/c7e3cc5f-ee64-4139-87b0-e709083511be-kube-api-access-p5sxc\") pod \"nova-api-0\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " pod="openstack/nova-api-0" Sep 30 19:52:04 crc kubenswrapper[4553]: I0930 19:52:04.093285 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e3cc5f-ee64-4139-87b0-e709083511be-logs\") pod \"nova-api-0\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " pod="openstack/nova-api-0" Sep 30 19:52:04 crc kubenswrapper[4553]: I0930 19:52:04.096469 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:52:04 crc kubenswrapper[4553]: I0930 19:52:04.096609 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e3cc5f-ee64-4139-87b0-e709083511be-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " pod="openstack/nova-api-0" Sep 30 19:52:04 crc kubenswrapper[4553]: I0930 19:52:04.097351 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e3cc5f-ee64-4139-87b0-e709083511be-config-data\") pod \"nova-api-0\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " pod="openstack/nova-api-0" Sep 30 19:52:04 crc kubenswrapper[4553]: I0930 19:52:04.125776 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5sxc\" (UniqueName: \"kubernetes.io/projected/c7e3cc5f-ee64-4139-87b0-e709083511be-kube-api-access-p5sxc\") pod \"nova-api-0\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " pod="openstack/nova-api-0" Sep 30 19:52:04 crc kubenswrapper[4553]: I0930 19:52:04.187813 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:52:04 crc kubenswrapper[4553]: I0930 19:52:04.554854 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:52:04 crc kubenswrapper[4553]: W0930 19:52:04.558668 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9385676_e090_4851_b31b_ccbc62073e7f.slice/crio-9b9ebca4efea0d77e4a81aa726c75a5f8f3156e35d4139d10623b648666f44fa WatchSource:0}: Error finding container 9b9ebca4efea0d77e4a81aa726c75a5f8f3156e35d4139d10623b648666f44fa: Status 404 returned error can't find the container with id 9b9ebca4efea0d77e4a81aa726c75a5f8f3156e35d4139d10623b648666f44fa Sep 30 19:52:04 crc kubenswrapper[4553]: I0930 19:52:04.651443 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:04 crc kubenswrapper[4553]: W0930 19:52:04.660136 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7e3cc5f_ee64_4139_87b0_e709083511be.slice/crio-26242e5912a557098d98da0db6374ffdfb7c808dd4c01603c5a95790a8be07af WatchSource:0}: Error finding container 26242e5912a557098d98da0db6374ffdfb7c808dd4c01603c5a95790a8be07af: Status 404 returned error can't find the container with id 26242e5912a557098d98da0db6374ffdfb7c808dd4c01603c5a95790a8be07af Sep 30 19:52:05 crc kubenswrapper[4553]: I0930 19:52:05.463564 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c7e3cc5f-ee64-4139-87b0-e709083511be","Type":"ContainerStarted","Data":"f5f061c2c0e68b3d50b45e1597909802d19c24ffafdc12636bbfac93a3f0260a"} Sep 30 19:52:05 crc kubenswrapper[4553]: I0930 19:52:05.464121 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"c7e3cc5f-ee64-4139-87b0-e709083511be","Type":"ContainerStarted","Data":"2d035bca3ab6c63ddb84471dbe7b57ef1eccc6d666b2be033a6ef74697cc3a36"} Sep 30 19:52:05 crc kubenswrapper[4553]: I0930 19:52:05.464142 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c7e3cc5f-ee64-4139-87b0-e709083511be","Type":"ContainerStarted","Data":"26242e5912a557098d98da0db6374ffdfb7c808dd4c01603c5a95790a8be07af"} Sep 30 19:52:05 crc kubenswrapper[4553]: I0930 19:52:05.469820 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b9385676-e090-4851-b31b-ccbc62073e7f","Type":"ContainerStarted","Data":"702791343bb82bcb9384f423eb17f07a702728dd2c8d42f5c007c841faf6960c"} Sep 30 19:52:05 crc kubenswrapper[4553]: I0930 19:52:05.469911 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b9385676-e090-4851-b31b-ccbc62073e7f","Type":"ContainerStarted","Data":"9b9ebca4efea0d77e4a81aa726c75a5f8f3156e35d4139d10623b648666f44fa"} Sep 30 19:52:05 crc kubenswrapper[4553]: I0930 19:52:05.505731 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.505717069 podStartE2EDuration="2.505717069s" podCreationTimestamp="2025-09-30 19:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:52:05.501162816 +0000 UTC m=+1178.700664946" watchObservedRunningTime="2025-09-30 19:52:05.505717069 +0000 UTC m=+1178.705219199" Sep 30 19:52:05 crc kubenswrapper[4553]: I0930 19:52:05.507686 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.507680921 podStartE2EDuration="2.507680921s" podCreationTimestamp="2025-09-30 19:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 19:52:05.48975421 +0000 UTC m=+1178.689256340" watchObservedRunningTime="2025-09-30 19:52:05.507680921 +0000 UTC m=+1178.707183051" Sep 30 19:52:05 crc kubenswrapper[4553]: I0930 19:52:05.518467 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d8e670-bb79-46a5-885a-35deb8d0ab28" path="/var/lib/kubelet/pods/10d8e670-bb79-46a5-885a-35deb8d0ab28/volumes" Sep 30 19:52:05 crc kubenswrapper[4553]: I0930 19:52:05.519006 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83fbb883-5c68-4d9e-b446-5c4292bfd3d6" path="/var/lib/kubelet/pods/83fbb883-5c68-4d9e-b446-5c4292bfd3d6/volumes" Sep 30 19:52:08 crc kubenswrapper[4553]: I0930 19:52:08.391385 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 19:52:09 crc kubenswrapper[4553]: I0930 19:52:09.097171 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 19:52:10 crc kubenswrapper[4553]: I0930 19:52:10.835388 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 30 19:52:14 crc kubenswrapper[4553]: I0930 19:52:14.097234 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 19:52:14 crc kubenswrapper[4553]: I0930 19:52:14.144164 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 19:52:14 crc kubenswrapper[4553]: I0930 19:52:14.190021 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 19:52:14 crc kubenswrapper[4553]: I0930 19:52:14.190099 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 19:52:14 crc kubenswrapper[4553]: I0930 19:52:14.596668 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-scheduler-0" Sep 30 19:52:15 crc kubenswrapper[4553]: I0930 19:52:15.273262 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c7e3cc5f-ee64-4139-87b0-e709083511be" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 19:52:15 crc kubenswrapper[4553]: I0930 19:52:15.273271 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c7e3cc5f-ee64-4139-87b0-e709083511be" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.660734 4553 generic.go:334] "Generic (PLEG): container finished" podID="0ab72afb-ab85-433a-9305-f157654c6755" containerID="d78d1d1f459881839fa467b986a4dbaf103aec56942f9b32ece2210fd2581a7f" exitCode=137 Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.661621 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ab72afb-ab85-433a-9305-f157654c6755","Type":"ContainerDied","Data":"d78d1d1f459881839fa467b986a4dbaf103aec56942f9b32ece2210fd2581a7f"} Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.667359 4553 generic.go:334] "Generic (PLEG): container finished" podID="e815e92c-4105-40fd-90e6-a17d35cdf5c6" containerID="18b5c0222ed74fbf6f8a18c5ac204b7f68e6dee3cbf8f2012f9b821cebde9fd3" exitCode=137 Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.667533 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e815e92c-4105-40fd-90e6-a17d35cdf5c6","Type":"ContainerDied","Data":"18b5c0222ed74fbf6f8a18c5ac204b7f68e6dee3cbf8f2012f9b821cebde9fd3"} Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.803856 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.809191 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.957601 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e815e92c-4105-40fd-90e6-a17d35cdf5c6-config-data\") pod \"e815e92c-4105-40fd-90e6-a17d35cdf5c6\" (UID: \"e815e92c-4105-40fd-90e6-a17d35cdf5c6\") " Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.957664 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ab72afb-ab85-433a-9305-f157654c6755-logs\") pod \"0ab72afb-ab85-433a-9305-f157654c6755\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") " Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.957771 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e815e92c-4105-40fd-90e6-a17d35cdf5c6-combined-ca-bundle\") pod \"e815e92c-4105-40fd-90e6-a17d35cdf5c6\" (UID: \"e815e92c-4105-40fd-90e6-a17d35cdf5c6\") " Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.957800 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frqxz\" (UniqueName: \"kubernetes.io/projected/0ab72afb-ab85-433a-9305-f157654c6755-kube-api-access-frqxz\") pod \"0ab72afb-ab85-433a-9305-f157654c6755\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") " Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.957864 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jt7l\" (UniqueName: \"kubernetes.io/projected/e815e92c-4105-40fd-90e6-a17d35cdf5c6-kube-api-access-4jt7l\") pod \"e815e92c-4105-40fd-90e6-a17d35cdf5c6\" (UID: 
\"e815e92c-4105-40fd-90e6-a17d35cdf5c6\") "
Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.957952 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ab72afb-ab85-433a-9305-f157654c6755-config-data\") pod \"0ab72afb-ab85-433a-9305-f157654c6755\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") "
Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.957981 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab72afb-ab85-433a-9305-f157654c6755-combined-ca-bundle\") pod \"0ab72afb-ab85-433a-9305-f157654c6755\" (UID: \"0ab72afb-ab85-433a-9305-f157654c6755\") "
Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.958794 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ab72afb-ab85-433a-9305-f157654c6755-logs" (OuterVolumeSpecName: "logs") pod "0ab72afb-ab85-433a-9305-f157654c6755" (UID: "0ab72afb-ab85-433a-9305-f157654c6755"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.964411 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab72afb-ab85-433a-9305-f157654c6755-kube-api-access-frqxz" (OuterVolumeSpecName: "kube-api-access-frqxz") pod "0ab72afb-ab85-433a-9305-f157654c6755" (UID: "0ab72afb-ab85-433a-9305-f157654c6755"). InnerVolumeSpecName "kube-api-access-frqxz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.971521 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e815e92c-4105-40fd-90e6-a17d35cdf5c6-kube-api-access-4jt7l" (OuterVolumeSpecName: "kube-api-access-4jt7l") pod "e815e92c-4105-40fd-90e6-a17d35cdf5c6" (UID: "e815e92c-4105-40fd-90e6-a17d35cdf5c6"). InnerVolumeSpecName "kube-api-access-4jt7l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.985306 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e815e92c-4105-40fd-90e6-a17d35cdf5c6-config-data" (OuterVolumeSpecName: "config-data") pod "e815e92c-4105-40fd-90e6-a17d35cdf5c6" (UID: "e815e92c-4105-40fd-90e6-a17d35cdf5c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.988474 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab72afb-ab85-433a-9305-f157654c6755-config-data" (OuterVolumeSpecName: "config-data") pod "0ab72afb-ab85-433a-9305-f157654c6755" (UID: "0ab72afb-ab85-433a-9305-f157654c6755"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:52:21 crc kubenswrapper[4553]: I0930 19:52:21.992374 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e815e92c-4105-40fd-90e6-a17d35cdf5c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e815e92c-4105-40fd-90e6-a17d35cdf5c6" (UID: "e815e92c-4105-40fd-90e6-a17d35cdf5c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.000700 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ab72afb-ab85-433a-9305-f157654c6755-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ab72afb-ab85-433a-9305-f157654c6755" (UID: "0ab72afb-ab85-433a-9305-f157654c6755"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.059599 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jt7l\" (UniqueName: \"kubernetes.io/projected/e815e92c-4105-40fd-90e6-a17d35cdf5c6-kube-api-access-4jt7l\") on node \"crc\" DevicePath \"\""
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.059638 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ab72afb-ab85-433a-9305-f157654c6755-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.059652 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab72afb-ab85-433a-9305-f157654c6755-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.059663 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e815e92c-4105-40fd-90e6-a17d35cdf5c6-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.059674 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ab72afb-ab85-433a-9305-f157654c6755-logs\") on node \"crc\" DevicePath \"\""
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.059687 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e815e92c-4105-40fd-90e6-a17d35cdf5c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.059696 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frqxz\" (UniqueName: \"kubernetes.io/projected/0ab72afb-ab85-433a-9305-f157654c6755-kube-api-access-frqxz\") on node \"crc\" DevicePath \"\""
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.678544 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ab72afb-ab85-433a-9305-f157654c6755","Type":"ContainerDied","Data":"39e4a5dd247f1fd39c0c03e7bcfdcef931aa0252c15b21dab6778e09320b593e"}
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.678574 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.678876 4553 scope.go:117] "RemoveContainer" containerID="d78d1d1f459881839fa467b986a4dbaf103aec56942f9b32ece2210fd2581a7f"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.681310 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e815e92c-4105-40fd-90e6-a17d35cdf5c6","Type":"ContainerDied","Data":"eb287f454b318c2dd22a1b4d2a259feed5b9b0ca8eef9a11fcbb589e4912d85e"}
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.681348 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.725100 4553 scope.go:117] "RemoveContainer" containerID="9c8d7f6d01def9ad09a60511557c8db1fbf0d4ea1f9af0ed4d07576f4ca432c4"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.730335 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.747870 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.763119 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 19:52:22 crc kubenswrapper[4553]: E0930 19:52:22.763680 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab72afb-ab85-433a-9305-f157654c6755" containerName="nova-metadata-log"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.763702 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab72afb-ab85-433a-9305-f157654c6755" containerName="nova-metadata-log"
Sep 30 19:52:22 crc kubenswrapper[4553]: E0930 19:52:22.763728 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e815e92c-4105-40fd-90e6-a17d35cdf5c6" containerName="nova-cell1-novncproxy-novncproxy"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.763737 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="e815e92c-4105-40fd-90e6-a17d35cdf5c6" containerName="nova-cell1-novncproxy-novncproxy"
Sep 30 19:52:22 crc kubenswrapper[4553]: E0930 19:52:22.763765 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab72afb-ab85-433a-9305-f157654c6755" containerName="nova-metadata-metadata"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.763774 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab72afb-ab85-433a-9305-f157654c6755" containerName="nova-metadata-metadata"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.764006 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="e815e92c-4105-40fd-90e6-a17d35cdf5c6" containerName="nova-cell1-novncproxy-novncproxy"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.764031 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab72afb-ab85-433a-9305-f157654c6755" containerName="nova-metadata-metadata"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.764071 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab72afb-ab85-433a-9305-f157654c6755" containerName="nova-metadata-log"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.764906 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.774708 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.774917 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.775053 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.775414 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.790014 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.809426 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.829598 4553 scope.go:117] "RemoveContainer" containerID="18b5c0222ed74fbf6f8a18c5ac204b7f68e6dee3cbf8f2012f9b821cebde9fd3"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.833436 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.835217 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.841419 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.841590 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.877618 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.879571 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/105050db-f2fe-48cc-ac77-649e4f2f2a83-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.879658 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/105050db-f2fe-48cc-ac77-649e4f2f2a83-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.879690 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/105050db-f2fe-48cc-ac77-649e4f2f2a83-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.879760 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/105050db-f2fe-48cc-ac77-649e4f2f2a83-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.879818 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgvrr\" (UniqueName: \"kubernetes.io/projected/105050db-f2fe-48cc-ac77-649e4f2f2a83-kube-api-access-lgvrr\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.984293 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/105050db-f2fe-48cc-ac77-649e4f2f2a83-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.984380 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-config-data\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.984407 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/105050db-f2fe-48cc-ac77-649e4f2f2a83-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.984431 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.984455 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwfgq\" (UniqueName: \"kubernetes.io/projected/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-kube-api-access-pwfgq\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.984480 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/105050db-f2fe-48cc-ac77-649e4f2f2a83-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.984533 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/105050db-f2fe-48cc-ac77-649e4f2f2a83-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.984574 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgvrr\" (UniqueName: \"kubernetes.io/projected/105050db-f2fe-48cc-ac77-649e4f2f2a83-kube-api-access-lgvrr\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.984605 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-logs\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:22 crc kubenswrapper[4553]: I0930 19:52:22.984620 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.005111 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/105050db-f2fe-48cc-ac77-649e4f2f2a83-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.010190 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/105050db-f2fe-48cc-ac77-649e4f2f2a83-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.019148 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/105050db-f2fe-48cc-ac77-649e4f2f2a83-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.024757 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/105050db-f2fe-48cc-ac77-649e4f2f2a83-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.033385 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgvrr\" (UniqueName: \"kubernetes.io/projected/105050db-f2fe-48cc-ac77-649e4f2f2a83-kube-api-access-lgvrr\") pod \"nova-cell1-novncproxy-0\" (UID: \"105050db-f2fe-48cc-ac77-649e4f2f2a83\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.085674 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.085724 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwfgq\" (UniqueName: \"kubernetes.io/projected/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-kube-api-access-pwfgq\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.085822 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-logs\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.085840 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.085890 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-config-data\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.088511 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-logs\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.089391 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-config-data\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.095484 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.095706 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.102859 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.107868 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwfgq\" (UniqueName: \"kubernetes.io/projected/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-kube-api-access-pwfgq\") pod \"nova-metadata-0\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.187454 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.517331 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab72afb-ab85-433a-9305-f157654c6755" path="/var/lib/kubelet/pods/0ab72afb-ab85-433a-9305-f157654c6755/volumes"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.518838 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e815e92c-4105-40fd-90e6-a17d35cdf5c6" path="/var/lib/kubelet/pods/e815e92c-4105-40fd-90e6-a17d35cdf5c6/volumes"
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.579995 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 19:52:23 crc kubenswrapper[4553]: W0930 19:52:23.582123 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod105050db_f2fe_48cc_ac77_649e4f2f2a83.slice/crio-79567a4e78d90d7a03053aa51df9affcfa4b5d24a7b0630218989e9aa333f048 WatchSource:0}: Error finding container 79567a4e78d90d7a03053aa51df9affcfa4b5d24a7b0630218989e9aa333f048: Status 404 returned error can't find the container with id 79567a4e78d90d7a03053aa51df9affcfa4b5d24a7b0630218989e9aa333f048
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.667799 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:52:23 crc kubenswrapper[4553]: W0930 19:52:23.670723 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3bd77eb_ad04_4611_a1a7_634cc17aa46c.slice/crio-0d0189d497a7d27187cca2c5bd5ebea3beeb04ec6785b448e61901e3ee1bdd10 WatchSource:0}: Error finding container 0d0189d497a7d27187cca2c5bd5ebea3beeb04ec6785b448e61901e3ee1bdd10: Status 404 returned error can't find the container with id 0d0189d497a7d27187cca2c5bd5ebea3beeb04ec6785b448e61901e3ee1bdd10
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.692229 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"105050db-f2fe-48cc-ac77-649e4f2f2a83","Type":"ContainerStarted","Data":"79567a4e78d90d7a03053aa51df9affcfa4b5d24a7b0630218989e9aa333f048"}
Sep 30 19:52:23 crc kubenswrapper[4553]: I0930 19:52:23.694120 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3bd77eb-ad04-4611-a1a7-634cc17aa46c","Type":"ContainerStarted","Data":"0d0189d497a7d27187cca2c5bd5ebea3beeb04ec6785b448e61901e3ee1bdd10"}
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.205287 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.205625 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.205978 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.206016 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.209421 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.211406 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.460679 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-xhbcp"]
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.462174 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.473729 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-xhbcp"]
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.621769 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.621838 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-config\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.622211 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.622266 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.622410 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.622473 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2txt\" (UniqueName: \"kubernetes.io/projected/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-kube-api-access-j2txt\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.705088 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3bd77eb-ad04-4611-a1a7-634cc17aa46c","Type":"ContainerStarted","Data":"869f25a59b6a895a8debbc224ce2d8a88b235eb76dd64121a7d4d2ae7d22ef88"}
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.705146 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3bd77eb-ad04-4611-a1a7-634cc17aa46c","Type":"ContainerStarted","Data":"0f88e3d01038185bc480cf232bdffadaf7a7d6c457084d34a971ee970418fe62"}
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.706741 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"105050db-f2fe-48cc-ac77-649e4f2f2a83","Type":"ContainerStarted","Data":"c27eb006c44f03992b691b9cf1360f7e1ff2a54c87e512b3bcb23c6a2a5f3d52"}
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.724580 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-config\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.724743 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.724770 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.724806 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.724831 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2txt\" (UniqueName: \"kubernetes.io/projected/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-kube-api-access-j2txt\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.724894 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.725987 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.726086 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.726135 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.726229 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.726343 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-config\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.727670 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.727659196 podStartE2EDuration="2.727659196s" podCreationTimestamp="2025-09-30 19:52:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:52:24.720590007 +0000 UTC m=+1197.920092157" watchObservedRunningTime="2025-09-30 19:52:24.727659196 +0000 UTC m=+1197.927161326"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.750656 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2txt\" (UniqueName: \"kubernetes.io/projected/7a1c1e1d-56a5-4748-8ca3-e210541bbcbe-kube-api-access-j2txt\") pod \"dnsmasq-dns-89c5cd4d5-xhbcp\" (UID: \"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe\") " pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.767506 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.767485105 podStartE2EDuration="2.767485105s" podCreationTimestamp="2025-09-30 19:52:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:52:24.746702278 +0000 UTC m=+1197.946204408" watchObservedRunningTime="2025-09-30 19:52:24.767485105 +0000 UTC m=+1197.966987235"
Sep 30 19:52:24 crc kubenswrapper[4553]: I0930 19:52:24.777740 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:25 crc kubenswrapper[4553]: I0930 19:52:25.288703 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-xhbcp"]
Sep 30 19:52:25 crc kubenswrapper[4553]: I0930 19:52:25.715661 4553 generic.go:334] "Generic (PLEG): container finished" podID="7a1c1e1d-56a5-4748-8ca3-e210541bbcbe" containerID="c2496bbf4cf8a10769a4a971ab1db06ab6abd9298ed6391748bc4590823066f8" exitCode=0
Sep 30 19:52:25 crc kubenswrapper[4553]: I0930 19:52:25.717622 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp" event={"ID":"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe","Type":"ContainerDied","Data":"c2496bbf4cf8a10769a4a971ab1db06ab6abd9298ed6391748bc4590823066f8"}
Sep 30 19:52:25 crc kubenswrapper[4553]: I0930 19:52:25.717653 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp" event={"ID":"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe","Type":"ContainerStarted","Data":"26d90f0cd3f3266c5c907f6c015404f0c53b5e21a10b86fa6e82dbcabdf880c5"}
Sep 30 19:52:26 crc kubenswrapper[4553]: I0930 19:52:26.724275 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp" event={"ID":"7a1c1e1d-56a5-4748-8ca3-e210541bbcbe","Type":"ContainerStarted","Data":"160b65d6fdc783a8135e59a56e5184795918f5d4634cfadd1b3383262a3e95ac"}
Sep 30 19:52:26 crc kubenswrapper[4553]: I0930 19:52:26.724530 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp"
Sep 30 19:52:26 crc kubenswrapper[4553]: I0930 19:52:26.742530 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp" podStartSLOduration=2.742515954 podStartE2EDuration="2.742515954s" podCreationTimestamp="2025-09-30 19:52:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:52:26.739480732 +0000 UTC m=+1199.938982862" watchObservedRunningTime="2025-09-30 19:52:26.742515954 +0000 UTC m=+1199.942018084"
Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.265840 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.266421 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="ceilometer-central-agent" containerID="cri-o://7998ae701e2817d1976708dfd8779c9125a906db5e6c124370d2f68dc1a59d5a" gracePeriod=30
Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.266435 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="proxy-httpd" containerID="cri-o://1044986b3a47f86a9c4d0d63401ad968abe8b000ff5048d7711ba66e6e737c78" gracePeriod=30
Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.266474 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="sg-core" containerID="cri-o://79722e5da2e24c11b7382ddee77cd86ea214e0a1196856aea5d13ab73f89a429" gracePeriod=30
Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.266579 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="ceilometer-notification-agent" containerID="cri-o://694a314055389753e23170211f8a634c89f3c2b0d2a10dd7d9422a6d88cb48d1" gracePeriod=30
Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.420296 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.420488 4553 kuberuntime_container.go:808] "Killing container with a grace
period" pod="openstack/nova-api-0" podUID="c7e3cc5f-ee64-4139-87b0-e709083511be" containerName="nova-api-log" containerID="cri-o://2d035bca3ab6c63ddb84471dbe7b57ef1eccc6d666b2be033a6ef74697cc3a36" gracePeriod=30 Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.420570 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c7e3cc5f-ee64-4139-87b0-e709083511be" containerName="nova-api-api" containerID="cri-o://f5f061c2c0e68b3d50b45e1597909802d19c24ffafdc12636bbfac93a3f0260a" gracePeriod=30 Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.734580 4553 generic.go:334] "Generic (PLEG): container finished" podID="c7e3cc5f-ee64-4139-87b0-e709083511be" containerID="2d035bca3ab6c63ddb84471dbe7b57ef1eccc6d666b2be033a6ef74697cc3a36" exitCode=143 Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.734714 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c7e3cc5f-ee64-4139-87b0-e709083511be","Type":"ContainerDied","Data":"2d035bca3ab6c63ddb84471dbe7b57ef1eccc6d666b2be033a6ef74697cc3a36"} Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.737103 4553 generic.go:334] "Generic (PLEG): container finished" podID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerID="1044986b3a47f86a9c4d0d63401ad968abe8b000ff5048d7711ba66e6e737c78" exitCode=0 Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.737124 4553 generic.go:334] "Generic (PLEG): container finished" podID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerID="79722e5da2e24c11b7382ddee77cd86ea214e0a1196856aea5d13ab73f89a429" exitCode=2 Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.737986 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4d66447-e03a-4cc3-9cf6-c99358e848de","Type":"ContainerDied","Data":"1044986b3a47f86a9c4d0d63401ad968abe8b000ff5048d7711ba66e6e737c78"} Sep 30 19:52:27 crc kubenswrapper[4553]: I0930 19:52:27.738015 4553 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4d66447-e03a-4cc3-9cf6-c99358e848de","Type":"ContainerDied","Data":"79722e5da2e24c11b7382ddee77cd86ea214e0a1196856aea5d13ab73f89a429"} Sep 30 19:52:28 crc kubenswrapper[4553]: I0930 19:52:28.103931 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:52:28 crc kubenswrapper[4553]: I0930 19:52:28.188464 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 19:52:28 crc kubenswrapper[4553]: I0930 19:52:28.188681 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 19:52:28 crc kubenswrapper[4553]: I0930 19:52:28.748060 4553 generic.go:334] "Generic (PLEG): container finished" podID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerID="7998ae701e2817d1976708dfd8779c9125a906db5e6c124370d2f68dc1a59d5a" exitCode=0 Sep 30 19:52:28 crc kubenswrapper[4553]: I0930 19:52:28.748751 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4d66447-e03a-4cc3-9cf6-c99358e848de","Type":"ContainerDied","Data":"7998ae701e2817d1976708dfd8779c9125a906db5e6c124370d2f68dc1a59d5a"} Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.316811 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.432695 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e3cc5f-ee64-4139-87b0-e709083511be-config-data\") pod \"c7e3cc5f-ee64-4139-87b0-e709083511be\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.432987 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e3cc5f-ee64-4139-87b0-e709083511be-logs\") pod \"c7e3cc5f-ee64-4139-87b0-e709083511be\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.433188 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e3cc5f-ee64-4139-87b0-e709083511be-combined-ca-bundle\") pod \"c7e3cc5f-ee64-4139-87b0-e709083511be\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.433301 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5sxc\" (UniqueName: \"kubernetes.io/projected/c7e3cc5f-ee64-4139-87b0-e709083511be-kube-api-access-p5sxc\") pod \"c7e3cc5f-ee64-4139-87b0-e709083511be\" (UID: \"c7e3cc5f-ee64-4139-87b0-e709083511be\") " Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.433410 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7e3cc5f-ee64-4139-87b0-e709083511be-logs" (OuterVolumeSpecName: "logs") pod "c7e3cc5f-ee64-4139-87b0-e709083511be" (UID: "c7e3cc5f-ee64-4139-87b0-e709083511be"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.433893 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e3cc5f-ee64-4139-87b0-e709083511be-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.453310 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e3cc5f-ee64-4139-87b0-e709083511be-kube-api-access-p5sxc" (OuterVolumeSpecName: "kube-api-access-p5sxc") pod "c7e3cc5f-ee64-4139-87b0-e709083511be" (UID: "c7e3cc5f-ee64-4139-87b0-e709083511be"). InnerVolumeSpecName "kube-api-access-p5sxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.464164 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e3cc5f-ee64-4139-87b0-e709083511be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7e3cc5f-ee64-4139-87b0-e709083511be" (UID: "c7e3cc5f-ee64-4139-87b0-e709083511be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.525331 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e3cc5f-ee64-4139-87b0-e709083511be-config-data" (OuterVolumeSpecName: "config-data") pod "c7e3cc5f-ee64-4139-87b0-e709083511be" (UID: "c7e3cc5f-ee64-4139-87b0-e709083511be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.537342 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e3cc5f-ee64-4139-87b0-e709083511be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.537501 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5sxc\" (UniqueName: \"kubernetes.io/projected/c7e3cc5f-ee64-4139-87b0-e709083511be-kube-api-access-p5sxc\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.537574 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e3cc5f-ee64-4139-87b0-e709083511be-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.777469 4553 generic.go:334] "Generic (PLEG): container finished" podID="c7e3cc5f-ee64-4139-87b0-e709083511be" containerID="f5f061c2c0e68b3d50b45e1597909802d19c24ffafdc12636bbfac93a3f0260a" exitCode=0 Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.777512 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c7e3cc5f-ee64-4139-87b0-e709083511be","Type":"ContainerDied","Data":"f5f061c2c0e68b3d50b45e1597909802d19c24ffafdc12636bbfac93a3f0260a"} Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.777539 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c7e3cc5f-ee64-4139-87b0-e709083511be","Type":"ContainerDied","Data":"26242e5912a557098d98da0db6374ffdfb7c808dd4c01603c5a95790a8be07af"} Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.777556 4553 scope.go:117] "RemoveContainer" containerID="f5f061c2c0e68b3d50b45e1597909802d19c24ffafdc12636bbfac93a3f0260a" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.777701 4553 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.806991 4553 scope.go:117] "RemoveContainer" containerID="2d035bca3ab6c63ddb84471dbe7b57ef1eccc6d666b2be033a6ef74697cc3a36" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.812793 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.821463 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.828005 4553 scope.go:117] "RemoveContainer" containerID="f5f061c2c0e68b3d50b45e1597909802d19c24ffafdc12636bbfac93a3f0260a" Sep 30 19:52:31 crc kubenswrapper[4553]: E0930 19:52:31.828626 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5f061c2c0e68b3d50b45e1597909802d19c24ffafdc12636bbfac93a3f0260a\": container with ID starting with f5f061c2c0e68b3d50b45e1597909802d19c24ffafdc12636bbfac93a3f0260a not found: ID does not exist" containerID="f5f061c2c0e68b3d50b45e1597909802d19c24ffafdc12636bbfac93a3f0260a" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.828656 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f061c2c0e68b3d50b45e1597909802d19c24ffafdc12636bbfac93a3f0260a"} err="failed to get container status \"f5f061c2c0e68b3d50b45e1597909802d19c24ffafdc12636bbfac93a3f0260a\": rpc error: code = NotFound desc = could not find container \"f5f061c2c0e68b3d50b45e1597909802d19c24ffafdc12636bbfac93a3f0260a\": container with ID starting with f5f061c2c0e68b3d50b45e1597909802d19c24ffafdc12636bbfac93a3f0260a not found: ID does not exist" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.828678 4553 scope.go:117] "RemoveContainer" containerID="2d035bca3ab6c63ddb84471dbe7b57ef1eccc6d666b2be033a6ef74697cc3a36" Sep 30 19:52:31 crc kubenswrapper[4553]: E0930 
19:52:31.828925 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d035bca3ab6c63ddb84471dbe7b57ef1eccc6d666b2be033a6ef74697cc3a36\": container with ID starting with 2d035bca3ab6c63ddb84471dbe7b57ef1eccc6d666b2be033a6ef74697cc3a36 not found: ID does not exist" containerID="2d035bca3ab6c63ddb84471dbe7b57ef1eccc6d666b2be033a6ef74697cc3a36" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.828970 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d035bca3ab6c63ddb84471dbe7b57ef1eccc6d666b2be033a6ef74697cc3a36"} err="failed to get container status \"2d035bca3ab6c63ddb84471dbe7b57ef1eccc6d666b2be033a6ef74697cc3a36\": rpc error: code = NotFound desc = could not find container \"2d035bca3ab6c63ddb84471dbe7b57ef1eccc6d666b2be033a6ef74697cc3a36\": container with ID starting with 2d035bca3ab6c63ddb84471dbe7b57ef1eccc6d666b2be033a6ef74697cc3a36 not found: ID does not exist" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.837092 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:31 crc kubenswrapper[4553]: E0930 19:52:31.837486 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e3cc5f-ee64-4139-87b0-e709083511be" containerName="nova-api-api" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.837503 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e3cc5f-ee64-4139-87b0-e709083511be" containerName="nova-api-api" Sep 30 19:52:31 crc kubenswrapper[4553]: E0930 19:52:31.837515 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e3cc5f-ee64-4139-87b0-e709083511be" containerName="nova-api-log" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.837520 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e3cc5f-ee64-4139-87b0-e709083511be" containerName="nova-api-log" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.837694 
4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e3cc5f-ee64-4139-87b0-e709083511be" containerName="nova-api-api" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.837710 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e3cc5f-ee64-4139-87b0-e709083511be" containerName="nova-api-log" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.839237 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.841420 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.841520 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.849681 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.852550 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.945600 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m4db\" (UniqueName: \"kubernetes.io/projected/bf6fed13-2a04-4e0f-8c60-15b54bd77477-kube-api-access-2m4db\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.945653 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.945711 4553 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf6fed13-2a04-4e0f-8c60-15b54bd77477-logs\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.945768 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-public-tls-certs\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.945791 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:31 crc kubenswrapper[4553]: I0930 19:52:31.945815 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-config-data\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.047279 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-public-tls-certs\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.047325 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.047347 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-config-data\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.047396 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m4db\" (UniqueName: \"kubernetes.io/projected/bf6fed13-2a04-4e0f-8c60-15b54bd77477-kube-api-access-2m4db\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.047424 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.047473 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf6fed13-2a04-4e0f-8c60-15b54bd77477-logs\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.047848 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf6fed13-2a04-4e0f-8c60-15b54bd77477-logs\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.051370 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.052099 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.053123 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-config-data\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.054453 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-public-tls-certs\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.065586 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m4db\" (UniqueName: \"kubernetes.io/projected/bf6fed13-2a04-4e0f-8c60-15b54bd77477-kube-api-access-2m4db\") pod \"nova-api-0\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.154802 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.442111 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.721447 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.808907 4553 generic.go:334] "Generic (PLEG): container finished" podID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerID="694a314055389753e23170211f8a634c89f3c2b0d2a10dd7d9422a6d88cb48d1" exitCode=0 Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.808983 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4d66447-e03a-4cc3-9cf6-c99358e848de","Type":"ContainerDied","Data":"694a314055389753e23170211f8a634c89f3c2b0d2a10dd7d9422a6d88cb48d1"} Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.809011 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4d66447-e03a-4cc3-9cf6-c99358e848de","Type":"ContainerDied","Data":"19ddf8debbfcffeed13df609b2cc33cdc644e7ccb46d5db89009f09d5f4e44a0"} Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.809029 4553 scope.go:117] "RemoveContainer" containerID="1044986b3a47f86a9c4d0d63401ad968abe8b000ff5048d7711ba66e6e737c78" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.809164 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.817001 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf6fed13-2a04-4e0f-8c60-15b54bd77477","Type":"ContainerStarted","Data":"501fe93336568e02fdaa1a023efb24d8b4e7a08ea1fa0b91426fea2c897dd08f"} Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.839874 4553 scope.go:117] "RemoveContainer" containerID="79722e5da2e24c11b7382ddee77cd86ea214e0a1196856aea5d13ab73f89a429" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.869269 4553 scope.go:117] "RemoveContainer" containerID="694a314055389753e23170211f8a634c89f3c2b0d2a10dd7d9422a6d88cb48d1" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.870656 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4d66447-e03a-4cc3-9cf6-c99358e848de-run-httpd\") pod \"f4d66447-e03a-4cc3-9cf6-c99358e848de\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.870822 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-config-data\") pod \"f4d66447-e03a-4cc3-9cf6-c99358e848de\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.870904 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-combined-ca-bundle\") pod \"f4d66447-e03a-4cc3-9cf6-c99358e848de\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.870927 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f4d66447-e03a-4cc3-9cf6-c99358e848de-log-httpd\") pod \"f4d66447-e03a-4cc3-9cf6-c99358e848de\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.870950 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsfnl\" (UniqueName: \"kubernetes.io/projected/f4d66447-e03a-4cc3-9cf6-c99358e848de-kube-api-access-lsfnl\") pod \"f4d66447-e03a-4cc3-9cf6-c99358e848de\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.870977 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-scripts\") pod \"f4d66447-e03a-4cc3-9cf6-c99358e848de\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.872154 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4d66447-e03a-4cc3-9cf6-c99358e848de-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f4d66447-e03a-4cc3-9cf6-c99358e848de" (UID: "f4d66447-e03a-4cc3-9cf6-c99358e848de"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.872281 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4d66447-e03a-4cc3-9cf6-c99358e848de-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f4d66447-e03a-4cc3-9cf6-c99358e848de" (UID: "f4d66447-e03a-4cc3-9cf6-c99358e848de"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.872547 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-ceilometer-tls-certs\") pod \"f4d66447-e03a-4cc3-9cf6-c99358e848de\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.872826 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-sg-core-conf-yaml\") pod \"f4d66447-e03a-4cc3-9cf6-c99358e848de\" (UID: \"f4d66447-e03a-4cc3-9cf6-c99358e848de\") " Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.873439 4553 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4d66447-e03a-4cc3-9cf6-c99358e848de-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.873456 4553 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4d66447-e03a-4cc3-9cf6-c99358e848de-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.879762 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-scripts" (OuterVolumeSpecName: "scripts") pod "f4d66447-e03a-4cc3-9cf6-c99358e848de" (UID: "f4d66447-e03a-4cc3-9cf6-c99358e848de"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.881347 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4d66447-e03a-4cc3-9cf6-c99358e848de-kube-api-access-lsfnl" (OuterVolumeSpecName: "kube-api-access-lsfnl") pod "f4d66447-e03a-4cc3-9cf6-c99358e848de" (UID: "f4d66447-e03a-4cc3-9cf6-c99358e848de"). InnerVolumeSpecName "kube-api-access-lsfnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.905953 4553 scope.go:117] "RemoveContainer" containerID="7998ae701e2817d1976708dfd8779c9125a906db5e6c124370d2f68dc1a59d5a" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.926547 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f4d66447-e03a-4cc3-9cf6-c99358e848de" (UID: "f4d66447-e03a-4cc3-9cf6-c99358e848de"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.930025 4553 scope.go:117] "RemoveContainer" containerID="1044986b3a47f86a9c4d0d63401ad968abe8b000ff5048d7711ba66e6e737c78" Sep 30 19:52:32 crc kubenswrapper[4553]: E0930 19:52:32.930351 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1044986b3a47f86a9c4d0d63401ad968abe8b000ff5048d7711ba66e6e737c78\": container with ID starting with 1044986b3a47f86a9c4d0d63401ad968abe8b000ff5048d7711ba66e6e737c78 not found: ID does not exist" containerID="1044986b3a47f86a9c4d0d63401ad968abe8b000ff5048d7711ba66e6e737c78" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.934147 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1044986b3a47f86a9c4d0d63401ad968abe8b000ff5048d7711ba66e6e737c78"} err="failed to get container status \"1044986b3a47f86a9c4d0d63401ad968abe8b000ff5048d7711ba66e6e737c78\": rpc error: code = NotFound desc = could not find container \"1044986b3a47f86a9c4d0d63401ad968abe8b000ff5048d7711ba66e6e737c78\": container with ID starting with 1044986b3a47f86a9c4d0d63401ad968abe8b000ff5048d7711ba66e6e737c78 not found: ID does not exist" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.934200 4553 scope.go:117] "RemoveContainer" containerID="79722e5da2e24c11b7382ddee77cd86ea214e0a1196856aea5d13ab73f89a429" Sep 30 19:52:32 crc kubenswrapper[4553]: E0930 19:52:32.939590 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79722e5da2e24c11b7382ddee77cd86ea214e0a1196856aea5d13ab73f89a429\": container with ID starting with 79722e5da2e24c11b7382ddee77cd86ea214e0a1196856aea5d13ab73f89a429 not found: ID does not exist" containerID="79722e5da2e24c11b7382ddee77cd86ea214e0a1196856aea5d13ab73f89a429" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.939631 
4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79722e5da2e24c11b7382ddee77cd86ea214e0a1196856aea5d13ab73f89a429"} err="failed to get container status \"79722e5da2e24c11b7382ddee77cd86ea214e0a1196856aea5d13ab73f89a429\": rpc error: code = NotFound desc = could not find container \"79722e5da2e24c11b7382ddee77cd86ea214e0a1196856aea5d13ab73f89a429\": container with ID starting with 79722e5da2e24c11b7382ddee77cd86ea214e0a1196856aea5d13ab73f89a429 not found: ID does not exist" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.939658 4553 scope.go:117] "RemoveContainer" containerID="694a314055389753e23170211f8a634c89f3c2b0d2a10dd7d9422a6d88cb48d1" Sep 30 19:52:32 crc kubenswrapper[4553]: E0930 19:52:32.941129 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"694a314055389753e23170211f8a634c89f3c2b0d2a10dd7d9422a6d88cb48d1\": container with ID starting with 694a314055389753e23170211f8a634c89f3c2b0d2a10dd7d9422a6d88cb48d1 not found: ID does not exist" containerID="694a314055389753e23170211f8a634c89f3c2b0d2a10dd7d9422a6d88cb48d1" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.941159 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694a314055389753e23170211f8a634c89f3c2b0d2a10dd7d9422a6d88cb48d1"} err="failed to get container status \"694a314055389753e23170211f8a634c89f3c2b0d2a10dd7d9422a6d88cb48d1\": rpc error: code = NotFound desc = could not find container \"694a314055389753e23170211f8a634c89f3c2b0d2a10dd7d9422a6d88cb48d1\": container with ID starting with 694a314055389753e23170211f8a634c89f3c2b0d2a10dd7d9422a6d88cb48d1 not found: ID does not exist" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.941174 4553 scope.go:117] "RemoveContainer" containerID="7998ae701e2817d1976708dfd8779c9125a906db5e6c124370d2f68dc1a59d5a" Sep 30 19:52:32 crc kubenswrapper[4553]: E0930 
19:52:32.941790 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7998ae701e2817d1976708dfd8779c9125a906db5e6c124370d2f68dc1a59d5a\": container with ID starting with 7998ae701e2817d1976708dfd8779c9125a906db5e6c124370d2f68dc1a59d5a not found: ID does not exist" containerID="7998ae701e2817d1976708dfd8779c9125a906db5e6c124370d2f68dc1a59d5a" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.941849 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7998ae701e2817d1976708dfd8779c9125a906db5e6c124370d2f68dc1a59d5a"} err="failed to get container status \"7998ae701e2817d1976708dfd8779c9125a906db5e6c124370d2f68dc1a59d5a\": rpc error: code = NotFound desc = could not find container \"7998ae701e2817d1976708dfd8779c9125a906db5e6c124370d2f68dc1a59d5a\": container with ID starting with 7998ae701e2817d1976708dfd8779c9125a906db5e6c124370d2f68dc1a59d5a not found: ID does not exist" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.944233 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f4d66447-e03a-4cc3-9cf6-c99358e848de" (UID: "f4d66447-e03a-4cc3-9cf6-c99358e848de"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.975184 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsfnl\" (UniqueName: \"kubernetes.io/projected/f4d66447-e03a-4cc3-9cf6-c99358e848de-kube-api-access-lsfnl\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.975595 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.975672 4553 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.975741 4553 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:32 crc kubenswrapper[4553]: I0930 19:52:32.981396 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4d66447-e03a-4cc3-9cf6-c99358e848de" (UID: "f4d66447-e03a-4cc3-9cf6-c99358e848de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.011917 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-config-data" (OuterVolumeSpecName: "config-data") pod "f4d66447-e03a-4cc3-9cf6-c99358e848de" (UID: "f4d66447-e03a-4cc3-9cf6-c99358e848de"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.077606 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.077633 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4d66447-e03a-4cc3-9cf6-c99358e848de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.103787 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.124556 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.167527 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.173263 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.182002 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:52:33 crc kubenswrapper[4553]: E0930 19:52:33.182400 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="sg-core" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.182416 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="sg-core" Sep 30 19:52:33 crc kubenswrapper[4553]: E0930 19:52:33.182428 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="proxy-httpd" Sep 30 19:52:33 crc 
kubenswrapper[4553]: I0930 19:52:33.182434 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="proxy-httpd" Sep 30 19:52:33 crc kubenswrapper[4553]: E0930 19:52:33.182446 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="ceilometer-notification-agent" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.182452 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="ceilometer-notification-agent" Sep 30 19:52:33 crc kubenswrapper[4553]: E0930 19:52:33.182464 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="ceilometer-central-agent" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.182469 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="ceilometer-central-agent" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.182650 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="ceilometer-central-agent" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.182668 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="ceilometer-notification-agent" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.182685 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="proxy-httpd" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.182696 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" containerName="sg-core" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.184272 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.186778 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.187099 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.187362 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.188420 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.188515 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.192814 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.281164 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-log-httpd\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.281227 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.281255 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-config-data\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.281276 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.281316 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-run-httpd\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.281341 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-scripts\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.281376 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ljrl\" (UniqueName: \"kubernetes.io/projected/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-kube-api-access-8ljrl\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.281424 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.383279 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ljrl\" (UniqueName: \"kubernetes.io/projected/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-kube-api-access-8ljrl\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.383602 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.383765 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-log-httpd\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.384194 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.384570 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-config-data\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.384663 4553 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.384760 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-run-httpd\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.384842 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-scripts\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.384150 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-log-httpd\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.385566 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-run-httpd\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.387760 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 
19:52:33.388648 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.388949 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-config-data\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.389389 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.390527 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-scripts\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.410500 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ljrl\" (UniqueName: \"kubernetes.io/projected/b964830e-4e1f-449d-bee4-fa7ed59b7ffc-kube-api-access-8ljrl\") pod \"ceilometer-0\" (UID: \"b964830e-4e1f-449d-bee4-fa7ed59b7ffc\") " pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.513840 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7e3cc5f-ee64-4139-87b0-e709083511be" path="/var/lib/kubelet/pods/c7e3cc5f-ee64-4139-87b0-e709083511be/volumes" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 
19:52:33.514521 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4d66447-e03a-4cc3-9cf6-c99358e848de" path="/var/lib/kubelet/pods/f4d66447-e03a-4cc3-9cf6-c99358e848de/volumes" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.514939 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.828138 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf6fed13-2a04-4e0f-8c60-15b54bd77477","Type":"ContainerStarted","Data":"9e407880b4ffb3ae3d56b6625377727d8b24aa5038142267afe240c0046c2a69"} Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.828389 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf6fed13-2a04-4e0f-8c60-15b54bd77477","Type":"ContainerStarted","Data":"1350e57842d5bc0b0b5f73740902875878971b5b58f3be342a740177a9b6da5c"} Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.862143 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8621230840000003 podStartE2EDuration="2.862123084s" podCreationTimestamp="2025-09-30 19:52:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:52:33.853296988 +0000 UTC m=+1207.052799118" watchObservedRunningTime="2025-09-30 19:52:33.862123084 +0000 UTC m=+1207.061625214" Sep 30 19:52:33 crc kubenswrapper[4553]: I0930 19:52:33.870115 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.013389 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.052239 4553 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.127490 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-sqkv9"] Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.129488 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.134742 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.135097 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.179267 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sqkv9"] Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.202281 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.202293 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.209266 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-config-data\") pod \"nova-cell1-cell-mapping-sqkv9\" (UID: 
\"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.209397 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sqkv9\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.209422 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzdp5\" (UniqueName: \"kubernetes.io/projected/a4567a10-e62e-4839-b8d8-ca0207fe6341-kube-api-access-nzdp5\") pod \"nova-cell1-cell-mapping-sqkv9\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.209480 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-scripts\") pod \"nova-cell1-cell-mapping-sqkv9\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.310516 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-config-data\") pod \"nova-cell1-cell-mapping-sqkv9\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.310667 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-combined-ca-bundle\") pod 
\"nova-cell1-cell-mapping-sqkv9\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.310692 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzdp5\" (UniqueName: \"kubernetes.io/projected/a4567a10-e62e-4839-b8d8-ca0207fe6341-kube-api-access-nzdp5\") pod \"nova-cell1-cell-mapping-sqkv9\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.311512 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-scripts\") pod \"nova-cell1-cell-mapping-sqkv9\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.317726 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-config-data\") pod \"nova-cell1-cell-mapping-sqkv9\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.320103 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-scripts\") pod \"nova-cell1-cell-mapping-sqkv9\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.327634 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sqkv9\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " 
pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.332734 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzdp5\" (UniqueName: \"kubernetes.io/projected/a4567a10-e62e-4839-b8d8-ca0207fe6341-kube-api-access-nzdp5\") pod \"nova-cell1-cell-mapping-sqkv9\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.452656 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.779154 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-xhbcp" Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.844637 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-bvvj2"] Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.844861 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" podUID="7f99a4a3-362d-4fbc-a960-4d1048895160" containerName="dnsmasq-dns" containerID="cri-o://ed549b92646be6141c1321c490975129afcf400a75ef1ce3571e532b9b0e3322" gracePeriod=10 Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.852804 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b964830e-4e1f-449d-bee4-fa7ed59b7ffc","Type":"ContainerStarted","Data":"93a37c8d5c3d02c715a413f70eb8cb1cd024e958a04121b2bdb756e8c2ef86da"} Sep 30 19:52:34 crc kubenswrapper[4553]: I0930 19:52:34.938496 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sqkv9"] Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.598144 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.639596 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-config\") pod \"7f99a4a3-362d-4fbc-a960-4d1048895160\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.639760 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-dns-swift-storage-0\") pod \"7f99a4a3-362d-4fbc-a960-4d1048895160\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.639787 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-ovsdbserver-nb\") pod \"7f99a4a3-362d-4fbc-a960-4d1048895160\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.639810 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-dns-svc\") pod \"7f99a4a3-362d-4fbc-a960-4d1048895160\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.639848 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq9ll\" (UniqueName: \"kubernetes.io/projected/7f99a4a3-362d-4fbc-a960-4d1048895160-kube-api-access-gq9ll\") pod \"7f99a4a3-362d-4fbc-a960-4d1048895160\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.639930 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-ovsdbserver-sb\") pod \"7f99a4a3-362d-4fbc-a960-4d1048895160\" (UID: \"7f99a4a3-362d-4fbc-a960-4d1048895160\") " Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.662235 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f99a4a3-362d-4fbc-a960-4d1048895160-kube-api-access-gq9ll" (OuterVolumeSpecName: "kube-api-access-gq9ll") pod "7f99a4a3-362d-4fbc-a960-4d1048895160" (UID: "7f99a4a3-362d-4fbc-a960-4d1048895160"). InnerVolumeSpecName "kube-api-access-gq9ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.743146 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq9ll\" (UniqueName: \"kubernetes.io/projected/7f99a4a3-362d-4fbc-a960-4d1048895160-kube-api-access-gq9ll\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.800653 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-config" (OuterVolumeSpecName: "config") pod "7f99a4a3-362d-4fbc-a960-4d1048895160" (UID: "7f99a4a3-362d-4fbc-a960-4d1048895160"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.816364 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7f99a4a3-362d-4fbc-a960-4d1048895160" (UID: "7f99a4a3-362d-4fbc-a960-4d1048895160"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.820838 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f99a4a3-362d-4fbc-a960-4d1048895160" (UID: "7f99a4a3-362d-4fbc-a960-4d1048895160"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.822842 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7f99a4a3-362d-4fbc-a960-4d1048895160" (UID: "7f99a4a3-362d-4fbc-a960-4d1048895160"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.837855 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7f99a4a3-362d-4fbc-a960-4d1048895160" (UID: "7f99a4a3-362d-4fbc-a960-4d1048895160"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.844145 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.844167 4553 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-config\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.844177 4553 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.844186 4553 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.844194 4553 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f99a4a3-362d-4fbc-a960-4d1048895160-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.863597 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sqkv9" event={"ID":"a4567a10-e62e-4839-b8d8-ca0207fe6341","Type":"ContainerStarted","Data":"47673b7e82979fac1b9428efd6e1a5d8361133b7f7c8081154d0532a0ac3394e"} Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.863864 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sqkv9" 
event={"ID":"a4567a10-e62e-4839-b8d8-ca0207fe6341","Type":"ContainerStarted","Data":"c3d4e89358ba69c76d55769fc07dc7b132570cbe14b8fe0eb9f4a286981cc3e8"} Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.866727 4553 generic.go:334] "Generic (PLEG): container finished" podID="7f99a4a3-362d-4fbc-a960-4d1048895160" containerID="ed549b92646be6141c1321c490975129afcf400a75ef1ce3571e532b9b0e3322" exitCode=0 Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.866769 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" event={"ID":"7f99a4a3-362d-4fbc-a960-4d1048895160","Type":"ContainerDied","Data":"ed549b92646be6141c1321c490975129afcf400a75ef1ce3571e532b9b0e3322"} Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.866786 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" event={"ID":"7f99a4a3-362d-4fbc-a960-4d1048895160","Type":"ContainerDied","Data":"31716bff2ab1519fc8ac08d6cbf474f4ab7c04b2a757584e8f3f3098e96d6ca2"} Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.866806 4553 scope.go:117] "RemoveContainer" containerID="ed549b92646be6141c1321c490975129afcf400a75ef1ce3571e532b9b0e3322" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.866887 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-bvvj2" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.870420 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b964830e-4e1f-449d-bee4-fa7ed59b7ffc","Type":"ContainerStarted","Data":"d972944651215b4f7b4e9dbf8396f363e963cc49fc3de769dde5164e68a9dc28"} Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.870460 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b964830e-4e1f-449d-bee4-fa7ed59b7ffc","Type":"ContainerStarted","Data":"4d80ed0127654a2f76db2353e29ad8672515b62b7a4929a342dc38049d3b3fdd"} Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.882076 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-sqkv9" podStartSLOduration=1.882056848 podStartE2EDuration="1.882056848s" podCreationTimestamp="2025-09-30 19:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:52:35.881371319 +0000 UTC m=+1209.080873449" watchObservedRunningTime="2025-09-30 19:52:35.882056848 +0000 UTC m=+1209.081558978" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.912766 4553 scope.go:117] "RemoveContainer" containerID="6bf3b823f872851de9bac893d429526851a9bb42d8a2f3e36ede8a9e59368ddf" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.960288 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-bvvj2"] Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.963568 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-bvvj2"] Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.976542 4553 scope.go:117] "RemoveContainer" containerID="ed549b92646be6141c1321c490975129afcf400a75ef1ce3571e532b9b0e3322" Sep 30 19:52:35 crc kubenswrapper[4553]: E0930 19:52:35.977137 4553 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed549b92646be6141c1321c490975129afcf400a75ef1ce3571e532b9b0e3322\": container with ID starting with ed549b92646be6141c1321c490975129afcf400a75ef1ce3571e532b9b0e3322 not found: ID does not exist" containerID="ed549b92646be6141c1321c490975129afcf400a75ef1ce3571e532b9b0e3322" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.977189 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed549b92646be6141c1321c490975129afcf400a75ef1ce3571e532b9b0e3322"} err="failed to get container status \"ed549b92646be6141c1321c490975129afcf400a75ef1ce3571e532b9b0e3322\": rpc error: code = NotFound desc = could not find container \"ed549b92646be6141c1321c490975129afcf400a75ef1ce3571e532b9b0e3322\": container with ID starting with ed549b92646be6141c1321c490975129afcf400a75ef1ce3571e532b9b0e3322 not found: ID does not exist" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.977216 4553 scope.go:117] "RemoveContainer" containerID="6bf3b823f872851de9bac893d429526851a9bb42d8a2f3e36ede8a9e59368ddf" Sep 30 19:52:35 crc kubenswrapper[4553]: E0930 19:52:35.981559 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bf3b823f872851de9bac893d429526851a9bb42d8a2f3e36ede8a9e59368ddf\": container with ID starting with 6bf3b823f872851de9bac893d429526851a9bb42d8a2f3e36ede8a9e59368ddf not found: ID does not exist" containerID="6bf3b823f872851de9bac893d429526851a9bb42d8a2f3e36ede8a9e59368ddf" Sep 30 19:52:35 crc kubenswrapper[4553]: I0930 19:52:35.981589 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bf3b823f872851de9bac893d429526851a9bb42d8a2f3e36ede8a9e59368ddf"} err="failed to get container status \"6bf3b823f872851de9bac893d429526851a9bb42d8a2f3e36ede8a9e59368ddf\": rpc error: code = NotFound 
desc = could not find container \"6bf3b823f872851de9bac893d429526851a9bb42d8a2f3e36ede8a9e59368ddf\": container with ID starting with 6bf3b823f872851de9bac893d429526851a9bb42d8a2f3e36ede8a9e59368ddf not found: ID does not exist" Sep 30 19:52:36 crc kubenswrapper[4553]: I0930 19:52:36.883215 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b964830e-4e1f-449d-bee4-fa7ed59b7ffc","Type":"ContainerStarted","Data":"6ecc25b44c8e306b365eb87ad8c4b908e4d7065e89faf8420e972659674cc06e"} Sep 30 19:52:37 crc kubenswrapper[4553]: I0930 19:52:37.534446 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f99a4a3-362d-4fbc-a960-4d1048895160" path="/var/lib/kubelet/pods/7f99a4a3-362d-4fbc-a960-4d1048895160/volumes" Sep 30 19:52:38 crc kubenswrapper[4553]: I0930 19:52:38.900686 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b964830e-4e1f-449d-bee4-fa7ed59b7ffc","Type":"ContainerStarted","Data":"0eb6e4f60ec8c17eb09aaa76d24e1b3461eb2114d03f25f810cde450b9f7685f"} Sep 30 19:52:38 crc kubenswrapper[4553]: I0930 19:52:38.901084 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 19:52:38 crc kubenswrapper[4553]: I0930 19:52:38.923587 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.258417944 podStartE2EDuration="5.923572005s" podCreationTimestamp="2025-09-30 19:52:33 +0000 UTC" firstStartedPulling="2025-09-30 19:52:34.051951349 +0000 UTC m=+1207.251453489" lastFinishedPulling="2025-09-30 19:52:37.71710541 +0000 UTC m=+1210.916607550" observedRunningTime="2025-09-30 19:52:38.918414406 +0000 UTC m=+1212.117916536" watchObservedRunningTime="2025-09-30 19:52:38.923572005 +0000 UTC m=+1212.123074135" Sep 30 19:52:41 crc kubenswrapper[4553]: I0930 19:52:41.927746 4553 generic.go:334] "Generic (PLEG): container finished" 
podID="a4567a10-e62e-4839-b8d8-ca0207fe6341" containerID="47673b7e82979fac1b9428efd6e1a5d8361133b7f7c8081154d0532a0ac3394e" exitCode=0 Sep 30 19:52:41 crc kubenswrapper[4553]: I0930 19:52:41.929449 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sqkv9" event={"ID":"a4567a10-e62e-4839-b8d8-ca0207fe6341","Type":"ContainerDied","Data":"47673b7e82979fac1b9428efd6e1a5d8361133b7f7c8081154d0532a0ac3394e"} Sep 30 19:52:42 crc kubenswrapper[4553]: I0930 19:52:42.155708 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 19:52:42 crc kubenswrapper[4553]: I0930 19:52:42.155752 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.173495 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bf6fed13-2a04-4e0f-8c60-15b54bd77477" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.174053 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bf6fed13-2a04-4e0f-8c60-15b54bd77477" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.232261 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.234125 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.252962 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.446211 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.591322 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-config-data\") pod \"a4567a10-e62e-4839-b8d8-ca0207fe6341\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.591642 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzdp5\" (UniqueName: \"kubernetes.io/projected/a4567a10-e62e-4839-b8d8-ca0207fe6341-kube-api-access-nzdp5\") pod \"a4567a10-e62e-4839-b8d8-ca0207fe6341\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.591806 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-scripts\") pod \"a4567a10-e62e-4839-b8d8-ca0207fe6341\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.591841 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-combined-ca-bundle\") pod \"a4567a10-e62e-4839-b8d8-ca0207fe6341\" (UID: \"a4567a10-e62e-4839-b8d8-ca0207fe6341\") " Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.611282 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4567a10-e62e-4839-b8d8-ca0207fe6341-kube-api-access-nzdp5" (OuterVolumeSpecName: "kube-api-access-nzdp5") pod "a4567a10-e62e-4839-b8d8-ca0207fe6341" (UID: "a4567a10-e62e-4839-b8d8-ca0207fe6341"). 
InnerVolumeSpecName "kube-api-access-nzdp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.612678 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-scripts" (OuterVolumeSpecName: "scripts") pod "a4567a10-e62e-4839-b8d8-ca0207fe6341" (UID: "a4567a10-e62e-4839-b8d8-ca0207fe6341"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.638658 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4567a10-e62e-4839-b8d8-ca0207fe6341" (UID: "a4567a10-e62e-4839-b8d8-ca0207fe6341"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.682355 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-config-data" (OuterVolumeSpecName: "config-data") pod "a4567a10-e62e-4839-b8d8-ca0207fe6341" (UID: "a4567a10-e62e-4839-b8d8-ca0207fe6341"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.695357 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.695546 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.695630 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzdp5\" (UniqueName: \"kubernetes.io/projected/a4567a10-e62e-4839-b8d8-ca0207fe6341-kube-api-access-nzdp5\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.695701 4553 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4567a10-e62e-4839-b8d8-ca0207fe6341-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.945985 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sqkv9" event={"ID":"a4567a10-e62e-4839-b8d8-ca0207fe6341","Type":"ContainerDied","Data":"c3d4e89358ba69c76d55769fc07dc7b132570cbe14b8fe0eb9f4a286981cc3e8"} Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.946477 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sqkv9" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.949242 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3d4e89358ba69c76d55769fc07dc7b132570cbe14b8fe0eb9f4a286981cc3e8" Sep 30 19:52:43 crc kubenswrapper[4553]: I0930 19:52:43.955489 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 19:52:44 crc kubenswrapper[4553]: I0930 19:52:44.136245 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:44 crc kubenswrapper[4553]: I0930 19:52:44.136521 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bf6fed13-2a04-4e0f-8c60-15b54bd77477" containerName="nova-api-log" containerID="cri-o://1350e57842d5bc0b0b5f73740902875878971b5b58f3be342a740177a9b6da5c" gracePeriod=30 Sep 30 19:52:44 crc kubenswrapper[4553]: I0930 19:52:44.136949 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bf6fed13-2a04-4e0f-8c60-15b54bd77477" containerName="nova-api-api" containerID="cri-o://9e407880b4ffb3ae3d56b6625377727d8b24aa5038142267afe240c0046c2a69" gracePeriod=30 Sep 30 19:52:44 crc kubenswrapper[4553]: I0930 19:52:44.190718 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 19:52:44 crc kubenswrapper[4553]: I0930 19:52:44.200330 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:52:44 crc kubenswrapper[4553]: I0930 19:52:44.200930 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b9385676-e090-4851-b31b-ccbc62073e7f" containerName="nova-scheduler-scheduler" containerID="cri-o://702791343bb82bcb9384f423eb17f07a702728dd2c8d42f5c007c841faf6960c" gracePeriod=30 Sep 30 19:52:44 crc kubenswrapper[4553]: I0930 
19:52:44.953892 4553 generic.go:334] "Generic (PLEG): container finished" podID="bf6fed13-2a04-4e0f-8c60-15b54bd77477" containerID="1350e57842d5bc0b0b5f73740902875878971b5b58f3be342a740177a9b6da5c" exitCode=143 Sep 30 19:52:44 crc kubenswrapper[4553]: I0930 19:52:44.954092 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf6fed13-2a04-4e0f-8c60-15b54bd77477","Type":"ContainerDied","Data":"1350e57842d5bc0b0b5f73740902875878971b5b58f3be342a740177a9b6da5c"} Sep 30 19:52:45 crc kubenswrapper[4553]: I0930 19:52:45.967951 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerName="nova-metadata-log" containerID="cri-o://0f88e3d01038185bc480cf232bdffadaf7a7d6c457084d34a971ee970418fe62" gracePeriod=30 Sep 30 19:52:45 crc kubenswrapper[4553]: I0930 19:52:45.968097 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerName="nova-metadata-metadata" containerID="cri-o://869f25a59b6a895a8debbc224ce2d8a88b235eb76dd64121a7d4d2ae7d22ef88" gracePeriod=30 Sep 30 19:52:47 crc kubenswrapper[4553]: I0930 19:52:47.011264 4553 generic.go:334] "Generic (PLEG): container finished" podID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerID="0f88e3d01038185bc480cf232bdffadaf7a7d6c457084d34a971ee970418fe62" exitCode=143 Sep 30 19:52:47 crc kubenswrapper[4553]: I0930 19:52:47.011402 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3bd77eb-ad04-4611-a1a7-634cc17aa46c","Type":"ContainerDied","Data":"0f88e3d01038185bc480cf232bdffadaf7a7d6c457084d34a971ee970418fe62"} Sep 30 19:52:48 crc kubenswrapper[4553]: I0930 19:52:48.037692 4553 generic.go:334] "Generic (PLEG): container finished" podID="b9385676-e090-4851-b31b-ccbc62073e7f" 
containerID="702791343bb82bcb9384f423eb17f07a702728dd2c8d42f5c007c841faf6960c" exitCode=0 Sep 30 19:52:48 crc kubenswrapper[4553]: I0930 19:52:48.037773 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b9385676-e090-4851-b31b-ccbc62073e7f","Type":"ContainerDied","Data":"702791343bb82bcb9384f423eb17f07a702728dd2c8d42f5c007c841faf6960c"} Sep 30 19:52:48 crc kubenswrapper[4553]: I0930 19:52:48.212660 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:52:48 crc kubenswrapper[4553]: I0930 19:52:48.243391 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9385676-e090-4851-b31b-ccbc62073e7f-config-data\") pod \"b9385676-e090-4851-b31b-ccbc62073e7f\" (UID: \"b9385676-e090-4851-b31b-ccbc62073e7f\") " Sep 30 19:52:48 crc kubenswrapper[4553]: I0930 19:52:48.243578 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9385676-e090-4851-b31b-ccbc62073e7f-combined-ca-bundle\") pod \"b9385676-e090-4851-b31b-ccbc62073e7f\" (UID: \"b9385676-e090-4851-b31b-ccbc62073e7f\") " Sep 30 19:52:48 crc kubenswrapper[4553]: I0930 19:52:48.243639 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvt4q\" (UniqueName: \"kubernetes.io/projected/b9385676-e090-4851-b31b-ccbc62073e7f-kube-api-access-fvt4q\") pod \"b9385676-e090-4851-b31b-ccbc62073e7f\" (UID: \"b9385676-e090-4851-b31b-ccbc62073e7f\") " Sep 30 19:52:48 crc kubenswrapper[4553]: I0930 19:52:48.250634 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9385676-e090-4851-b31b-ccbc62073e7f-kube-api-access-fvt4q" (OuterVolumeSpecName: "kube-api-access-fvt4q") pod "b9385676-e090-4851-b31b-ccbc62073e7f" (UID: 
"b9385676-e090-4851-b31b-ccbc62073e7f"). InnerVolumeSpecName "kube-api-access-fvt4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:52:48 crc kubenswrapper[4553]: I0930 19:52:48.283179 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9385676-e090-4851-b31b-ccbc62073e7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9385676-e090-4851-b31b-ccbc62073e7f" (UID: "b9385676-e090-4851-b31b-ccbc62073e7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:48 crc kubenswrapper[4553]: I0930 19:52:48.286349 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9385676-e090-4851-b31b-ccbc62073e7f-config-data" (OuterVolumeSpecName: "config-data") pod "b9385676-e090-4851-b31b-ccbc62073e7f" (UID: "b9385676-e090-4851-b31b-ccbc62073e7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:48 crc kubenswrapper[4553]: I0930 19:52:48.345643 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9385676-e090-4851-b31b-ccbc62073e7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:48 crc kubenswrapper[4553]: I0930 19:52:48.345673 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvt4q\" (UniqueName: \"kubernetes.io/projected/b9385676-e090-4851-b31b-ccbc62073e7f-kube-api-access-fvt4q\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:48 crc kubenswrapper[4553]: I0930 19:52:48.345686 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9385676-e090-4851-b31b-ccbc62073e7f-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:48 crc kubenswrapper[4553]: I0930 19:52:48.960749 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.046676 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b9385676-e090-4851-b31b-ccbc62073e7f","Type":"ContainerDied","Data":"9b9ebca4efea0d77e4a81aa726c75a5f8f3156e35d4139d10623b648666f44fa"} Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.047376 4553 scope.go:117] "RemoveContainer" containerID="702791343bb82bcb9384f423eb17f07a702728dd2c8d42f5c007c841faf6960c" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.047599 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.052534 4553 generic.go:334] "Generic (PLEG): container finished" podID="bf6fed13-2a04-4e0f-8c60-15b54bd77477" containerID="9e407880b4ffb3ae3d56b6625377727d8b24aa5038142267afe240c0046c2a69" exitCode=0 Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.052728 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf6fed13-2a04-4e0f-8c60-15b54bd77477","Type":"ContainerDied","Data":"9e407880b4ffb3ae3d56b6625377727d8b24aa5038142267afe240c0046c2a69"} Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.052854 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf6fed13-2a04-4e0f-8c60-15b54bd77477","Type":"ContainerDied","Data":"501fe93336568e02fdaa1a023efb24d8b4e7a08ea1fa0b91426fea2c897dd08f"} Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.052982 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.088235 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.097097 4553 scope.go:117] "RemoveContainer" containerID="9e407880b4ffb3ae3d56b6625377727d8b24aa5038142267afe240c0046c2a69" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.098895 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.111319 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:52:49 crc kubenswrapper[4553]: E0930 19:52:49.111764 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9385676-e090-4851-b31b-ccbc62073e7f" containerName="nova-scheduler-scheduler" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.111776 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9385676-e090-4851-b31b-ccbc62073e7f" containerName="nova-scheduler-scheduler" Sep 30 19:52:49 crc kubenswrapper[4553]: E0930 19:52:49.112186 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4567a10-e62e-4839-b8d8-ca0207fe6341" containerName="nova-manage" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.112195 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4567a10-e62e-4839-b8d8-ca0207fe6341" containerName="nova-manage" Sep 30 19:52:49 crc kubenswrapper[4553]: E0930 19:52:49.112214 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f99a4a3-362d-4fbc-a960-4d1048895160" containerName="init" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.112220 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f99a4a3-362d-4fbc-a960-4d1048895160" containerName="init" Sep 30 19:52:49 crc kubenswrapper[4553]: E0930 19:52:49.112236 4553 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7f99a4a3-362d-4fbc-a960-4d1048895160" containerName="dnsmasq-dns" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.112242 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f99a4a3-362d-4fbc-a960-4d1048895160" containerName="dnsmasq-dns" Sep 30 19:52:49 crc kubenswrapper[4553]: E0930 19:52:49.112252 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6fed13-2a04-4e0f-8c60-15b54bd77477" containerName="nova-api-log" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.112258 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6fed13-2a04-4e0f-8c60-15b54bd77477" containerName="nova-api-log" Sep 30 19:52:49 crc kubenswrapper[4553]: E0930 19:52:49.112276 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6fed13-2a04-4e0f-8c60-15b54bd77477" containerName="nova-api-api" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.112281 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6fed13-2a04-4e0f-8c60-15b54bd77477" containerName="nova-api-api" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.112450 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9385676-e090-4851-b31b-ccbc62073e7f" containerName="nova-scheduler-scheduler" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.112462 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4567a10-e62e-4839-b8d8-ca0207fe6341" containerName="nova-manage" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.112473 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf6fed13-2a04-4e0f-8c60-15b54bd77477" containerName="nova-api-log" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.112480 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f99a4a3-362d-4fbc-a960-4d1048895160" containerName="dnsmasq-dns" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.112491 4553 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bf6fed13-2a04-4e0f-8c60-15b54bd77477" containerName="nova-api-api" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.114459 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.117426 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.124570 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.134465 4553 scope.go:117] "RemoveContainer" containerID="1350e57842d5bc0b0b5f73740902875878971b5b58f3be342a740177a9b6da5c" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.156864 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-public-tls-certs\") pod \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.156945 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf6fed13-2a04-4e0f-8c60-15b54bd77477-logs\") pod \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.156977 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-internal-tls-certs\") pod \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.157097 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-combined-ca-bundle\") pod \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.157146 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-config-data\") pod \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.157288 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m4db\" (UniqueName: \"kubernetes.io/projected/bf6fed13-2a04-4e0f-8c60-15b54bd77477-kube-api-access-2m4db\") pod \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\" (UID: \"bf6fed13-2a04-4e0f-8c60-15b54bd77477\") " Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.157571 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf6fed13-2a04-4e0f-8c60-15b54bd77477-logs" (OuterVolumeSpecName: "logs") pod "bf6fed13-2a04-4e0f-8c60-15b54bd77477" (UID: "bf6fed13-2a04-4e0f-8c60-15b54bd77477"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.163652 4553 scope.go:117] "RemoveContainer" containerID="9e407880b4ffb3ae3d56b6625377727d8b24aa5038142267afe240c0046c2a69" Sep 30 19:52:49 crc kubenswrapper[4553]: E0930 19:52:49.174523 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e407880b4ffb3ae3d56b6625377727d8b24aa5038142267afe240c0046c2a69\": container with ID starting with 9e407880b4ffb3ae3d56b6625377727d8b24aa5038142267afe240c0046c2a69 not found: ID does not exist" containerID="9e407880b4ffb3ae3d56b6625377727d8b24aa5038142267afe240c0046c2a69" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.174569 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e407880b4ffb3ae3d56b6625377727d8b24aa5038142267afe240c0046c2a69"} err="failed to get container status \"9e407880b4ffb3ae3d56b6625377727d8b24aa5038142267afe240c0046c2a69\": rpc error: code = NotFound desc = could not find container \"9e407880b4ffb3ae3d56b6625377727d8b24aa5038142267afe240c0046c2a69\": container with ID starting with 9e407880b4ffb3ae3d56b6625377727d8b24aa5038142267afe240c0046c2a69 not found: ID does not exist" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.174611 4553 scope.go:117] "RemoveContainer" containerID="1350e57842d5bc0b0b5f73740902875878971b5b58f3be342a740177a9b6da5c" Sep 30 19:52:49 crc kubenswrapper[4553]: E0930 19:52:49.175018 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1350e57842d5bc0b0b5f73740902875878971b5b58f3be342a740177a9b6da5c\": container with ID starting with 1350e57842d5bc0b0b5f73740902875878971b5b58f3be342a740177a9b6da5c not found: ID does not exist" containerID="1350e57842d5bc0b0b5f73740902875878971b5b58f3be342a740177a9b6da5c" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.175135 
4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1350e57842d5bc0b0b5f73740902875878971b5b58f3be342a740177a9b6da5c"} err="failed to get container status \"1350e57842d5bc0b0b5f73740902875878971b5b58f3be342a740177a9b6da5c\": rpc error: code = NotFound desc = could not find container \"1350e57842d5bc0b0b5f73740902875878971b5b58f3be342a740177a9b6da5c\": container with ID starting with 1350e57842d5bc0b0b5f73740902875878971b5b58f3be342a740177a9b6da5c not found: ID does not exist" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.184399 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf6fed13-2a04-4e0f-8c60-15b54bd77477-kube-api-access-2m4db" (OuterVolumeSpecName: "kube-api-access-2m4db") pod "bf6fed13-2a04-4e0f-8c60-15b54bd77477" (UID: "bf6fed13-2a04-4e0f-8c60-15b54bd77477"). InnerVolumeSpecName "kube-api-access-2m4db". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.196955 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf6fed13-2a04-4e0f-8c60-15b54bd77477" (UID: "bf6fed13-2a04-4e0f-8c60-15b54bd77477"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.199370 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-config-data" (OuterVolumeSpecName: "config-data") pod "bf6fed13-2a04-4e0f-8c60-15b54bd77477" (UID: "bf6fed13-2a04-4e0f-8c60-15b54bd77477"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.208109 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:53340->10.217.0.199:8775: read: connection reset by peer" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.208432 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:53332->10.217.0.199:8775: read: connection reset by peer" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.215958 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bf6fed13-2a04-4e0f-8c60-15b54bd77477" (UID: "bf6fed13-2a04-4e0f-8c60-15b54bd77477"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.253642 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bf6fed13-2a04-4e0f-8c60-15b54bd77477" (UID: "bf6fed13-2a04-4e0f-8c60-15b54bd77477"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.260180 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec8d773-3769-4cd8-9da4-4696693173fa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0ec8d773-3769-4cd8-9da4-4696693173fa\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.260287 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec8d773-3769-4cd8-9da4-4696693173fa-config-data\") pod \"nova-scheduler-0\" (UID: \"0ec8d773-3769-4cd8-9da4-4696693173fa\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.260316 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9b56\" (UniqueName: \"kubernetes.io/projected/0ec8d773-3769-4cd8-9da4-4696693173fa-kube-api-access-k9b56\") pod \"nova-scheduler-0\" (UID: \"0ec8d773-3769-4cd8-9da4-4696693173fa\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.261000 4553 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.261027 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.261081 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-config-data\") on node 
\"crc\" DevicePath \"\"" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.261094 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m4db\" (UniqueName: \"kubernetes.io/projected/bf6fed13-2a04-4e0f-8c60-15b54bd77477-kube-api-access-2m4db\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.261105 4553 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf6fed13-2a04-4e0f-8c60-15b54bd77477-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.261114 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf6fed13-2a04-4e0f-8c60-15b54bd77477-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.363576 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec8d773-3769-4cd8-9da4-4696693173fa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0ec8d773-3769-4cd8-9da4-4696693173fa\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.363624 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec8d773-3769-4cd8-9da4-4696693173fa-config-data\") pod \"nova-scheduler-0\" (UID: \"0ec8d773-3769-4cd8-9da4-4696693173fa\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.363648 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9b56\" (UniqueName: \"kubernetes.io/projected/0ec8d773-3769-4cd8-9da4-4696693173fa-kube-api-access-k9b56\") pod \"nova-scheduler-0\" (UID: \"0ec8d773-3769-4cd8-9da4-4696693173fa\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.367701 4553 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ec8d773-3769-4cd8-9da4-4696693173fa-config-data\") pod \"nova-scheduler-0\" (UID: \"0ec8d773-3769-4cd8-9da4-4696693173fa\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.378308 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ec8d773-3769-4cd8-9da4-4696693173fa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0ec8d773-3769-4cd8-9da4-4696693173fa\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.392397 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.393401 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9b56\" (UniqueName: \"kubernetes.io/projected/0ec8d773-3769-4cd8-9da4-4696693173fa-kube-api-access-k9b56\") pod \"nova-scheduler-0\" (UID: \"0ec8d773-3769-4cd8-9da4-4696693173fa\") " pod="openstack/nova-scheduler-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.425809 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.444528 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.446085 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.449537 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.449629 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.449782 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.458659 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.458800 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.538796 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9385676-e090-4851-b31b-ccbc62073e7f" path="/var/lib/kubelet/pods/b9385676-e090-4851-b31b-ccbc62073e7f/volumes" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.539657 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf6fed13-2a04-4e0f-8c60-15b54bd77477" path="/var/lib/kubelet/pods/bf6fed13-2a04-4e0f-8c60-15b54bd77477/volumes" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.552427 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.571259 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2ac042a-d1af-49f5-b573-f76fc772dabd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.571317 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ac042a-d1af-49f5-b573-f76fc772dabd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.571345 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ln5p\" (UniqueName: \"kubernetes.io/projected/d2ac042a-d1af-49f5-b573-f76fc772dabd-kube-api-access-4ln5p\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.571396 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2ac042a-d1af-49f5-b573-f76fc772dabd-config-data\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.571590 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ac042a-d1af-49f5-b573-f76fc772dabd-public-tls-certs\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:49 crc kubenswrapper[4553]: I0930 19:52:49.571625 
4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2ac042a-d1af-49f5-b573-f76fc772dabd-logs\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.672753 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-config-data\") pod \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.672902 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwfgq\" (UniqueName: \"kubernetes.io/projected/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-kube-api-access-pwfgq\") pod \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.672940 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-nova-metadata-tls-certs\") pod \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.672997 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-logs\") pod \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.673063 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-combined-ca-bundle\") pod 
\"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\" (UID: \"c3bd77eb-ad04-4611-a1a7-634cc17aa46c\") " Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.673336 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2ac042a-d1af-49f5-b573-f76fc772dabd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.673371 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ac042a-d1af-49f5-b573-f76fc772dabd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.673389 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ln5p\" (UniqueName: \"kubernetes.io/projected/d2ac042a-d1af-49f5-b573-f76fc772dabd-kube-api-access-4ln5p\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.673438 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2ac042a-d1af-49f5-b573-f76fc772dabd-config-data\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.673554 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ac042a-d1af-49f5-b573-f76fc772dabd-public-tls-certs\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.673579 4553 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2ac042a-d1af-49f5-b573-f76fc772dabd-logs\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.673866 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-logs" (OuterVolumeSpecName: "logs") pod "c3bd77eb-ad04-4611-a1a7-634cc17aa46c" (UID: "c3bd77eb-ad04-4611-a1a7-634cc17aa46c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.674065 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2ac042a-d1af-49f5-b573-f76fc772dabd-logs\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.679810 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-kube-api-access-pwfgq" (OuterVolumeSpecName: "kube-api-access-pwfgq") pod "c3bd77eb-ad04-4611-a1a7-634cc17aa46c" (UID: "c3bd77eb-ad04-4611-a1a7-634cc17aa46c"). InnerVolumeSpecName "kube-api-access-pwfgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.680598 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ac042a-d1af-49f5-b573-f76fc772dabd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.690222 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2ac042a-d1af-49f5-b573-f76fc772dabd-public-tls-certs\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.694266 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2ac042a-d1af-49f5-b573-f76fc772dabd-config-data\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.694546 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ln5p\" (UniqueName: \"kubernetes.io/projected/d2ac042a-d1af-49f5-b573-f76fc772dabd-kube-api-access-4ln5p\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.703261 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2ac042a-d1af-49f5-b573-f76fc772dabd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2ac042a-d1af-49f5-b573-f76fc772dabd\") " pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.713971 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-config-data" (OuterVolumeSpecName: "config-data") pod "c3bd77eb-ad04-4611-a1a7-634cc17aa46c" (UID: "c3bd77eb-ad04-4611-a1a7-634cc17aa46c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.734161 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3bd77eb-ad04-4611-a1a7-634cc17aa46c" (UID: "c3bd77eb-ad04-4611-a1a7-634cc17aa46c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.769448 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c3bd77eb-ad04-4611-a1a7-634cc17aa46c" (UID: "c3bd77eb-ad04-4611-a1a7-634cc17aa46c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.775667 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwfgq\" (UniqueName: \"kubernetes.io/projected/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-kube-api-access-pwfgq\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.776218 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.776256 4553 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.776274 4553 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-logs\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.776285 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:49.776294 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3bd77eb-ad04-4611-a1a7-634cc17aa46c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.065438 4553 generic.go:334] "Generic (PLEG): container finished" podID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerID="869f25a59b6a895a8debbc224ce2d8a88b235eb76dd64121a7d4d2ae7d22ef88" exitCode=0 Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.065537 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3bd77eb-ad04-4611-a1a7-634cc17aa46c","Type":"ContainerDied","Data":"869f25a59b6a895a8debbc224ce2d8a88b235eb76dd64121a7d4d2ae7d22ef88"} Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.065585 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.066012 4553 scope.go:117] "RemoveContainer" containerID="869f25a59b6a895a8debbc224ce2d8a88b235eb76dd64121a7d4d2ae7d22ef88"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.065996 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c3bd77eb-ad04-4611-a1a7-634cc17aa46c","Type":"ContainerDied","Data":"0d0189d497a7d27187cca2c5bd5ebea3beeb04ec6785b448e61901e3ee1bdd10"}
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.094478 4553 scope.go:117] "RemoveContainer" containerID="0f88e3d01038185bc480cf232bdffadaf7a7d6c457084d34a971ee970418fe62"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.119144 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.134385 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.144086 4553 scope.go:117] "RemoveContainer" containerID="869f25a59b6a895a8debbc224ce2d8a88b235eb76dd64121a7d4d2ae7d22ef88"
Sep 30 19:52:50 crc kubenswrapper[4553]: E0930 19:52:50.144598 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"869f25a59b6a895a8debbc224ce2d8a88b235eb76dd64121a7d4d2ae7d22ef88\": container with ID starting with 869f25a59b6a895a8debbc224ce2d8a88b235eb76dd64121a7d4d2ae7d22ef88 not found: ID does not exist" containerID="869f25a59b6a895a8debbc224ce2d8a88b235eb76dd64121a7d4d2ae7d22ef88"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.144635 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"869f25a59b6a895a8debbc224ce2d8a88b235eb76dd64121a7d4d2ae7d22ef88"} err="failed to get container status \"869f25a59b6a895a8debbc224ce2d8a88b235eb76dd64121a7d4d2ae7d22ef88\": rpc error: code = NotFound desc = could not find container \"869f25a59b6a895a8debbc224ce2d8a88b235eb76dd64121a7d4d2ae7d22ef88\": container with ID starting with 869f25a59b6a895a8debbc224ce2d8a88b235eb76dd64121a7d4d2ae7d22ef88 not found: ID does not exist"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.144656 4553 scope.go:117] "RemoveContainer" containerID="0f88e3d01038185bc480cf232bdffadaf7a7d6c457084d34a971ee970418fe62"
Sep 30 19:52:50 crc kubenswrapper[4553]: E0930 19:52:50.144837 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f88e3d01038185bc480cf232bdffadaf7a7d6c457084d34a971ee970418fe62\": container with ID starting with 0f88e3d01038185bc480cf232bdffadaf7a7d6c457084d34a971ee970418fe62 not found: ID does not exist" containerID="0f88e3d01038185bc480cf232bdffadaf7a7d6c457084d34a971ee970418fe62"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.144853 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f88e3d01038185bc480cf232bdffadaf7a7d6c457084d34a971ee970418fe62"} err="failed to get container status \"0f88e3d01038185bc480cf232bdffadaf7a7d6c457084d34a971ee970418fe62\": rpc error: code = NotFound desc = could not find container \"0f88e3d01038185bc480cf232bdffadaf7a7d6c457084d34a971ee970418fe62\": container with ID starting with 0f88e3d01038185bc480cf232bdffadaf7a7d6c457084d34a971ee970418fe62 not found: ID does not exist"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.146457 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:52:50 crc kubenswrapper[4553]: E0930 19:52:50.146897 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerName="nova-metadata-metadata"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.146908 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerName="nova-metadata-metadata"
Sep 30 19:52:50 crc kubenswrapper[4553]: E0930 19:52:50.146925 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerName="nova-metadata-log"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.146932 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerName="nova-metadata-log"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.147191 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerName="nova-metadata-log"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.147216 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" containerName="nova-metadata-metadata"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.148132 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.150827 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.151145 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.158305 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.186182 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48a9de6d-a606-4cad-9153-8891c6a322a2-logs\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.186318 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a9de6d-a606-4cad-9153-8891c6a322a2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.186356 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fkml\" (UniqueName: \"kubernetes.io/projected/48a9de6d-a606-4cad-9153-8891c6a322a2-kube-api-access-5fkml\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.186414 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/48a9de6d-a606-4cad-9153-8891c6a322a2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.186429 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a9de6d-a606-4cad-9153-8891c6a322a2-config-data\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.287774 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a9de6d-a606-4cad-9153-8891c6a322a2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.287828 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fkml\" (UniqueName: \"kubernetes.io/projected/48a9de6d-a606-4cad-9153-8891c6a322a2-kube-api-access-5fkml\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.288332 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/48a9de6d-a606-4cad-9153-8891c6a322a2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.288384 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a9de6d-a606-4cad-9153-8891c6a322a2-config-data\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.288466 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48a9de6d-a606-4cad-9153-8891c6a322a2-logs\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.289150 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48a9de6d-a606-4cad-9153-8891c6a322a2-logs\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.292455 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/48a9de6d-a606-4cad-9153-8891c6a322a2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.294938 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a9de6d-a606-4cad-9153-8891c6a322a2-config-data\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.306297 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a9de6d-a606-4cad-9153-8891c6a322a2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.308522 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fkml\" (UniqueName: \"kubernetes.io/projected/48a9de6d-a606-4cad-9153-8891c6a322a2-kube-api-access-5fkml\") pod \"nova-metadata-0\" (UID: \"48a9de6d-a606-4cad-9153-8891c6a322a2\") " pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.501770 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.642321 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 19:52:50 crc kubenswrapper[4553]: W0930 19:52:50.647609 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ec8d773_3769_4cd8_9da4_4696693173fa.slice/crio-743246fd9940a975cbb6e80cb875a3985c02f04709ada24dd371c99d6fcba62f WatchSource:0}: Error finding container 743246fd9940a975cbb6e80cb875a3985c02f04709ada24dd371c99d6fcba62f: Status 404 returned error can't find the container with id 743246fd9940a975cbb6e80cb875a3985c02f04709ada24dd371c99d6fcba62f
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.648848 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 19:52:50 crc kubenswrapper[4553]: I0930 19:52:50.967118 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 19:52:51 crc kubenswrapper[4553]: I0930 19:52:51.088459 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0ec8d773-3769-4cd8-9da4-4696693173fa","Type":"ContainerStarted","Data":"1a959b1a11e79ba510e506535a555c6cde39b53bf06233de1cc4d910fa6eb108"}
Sep 30 19:52:51 crc kubenswrapper[4553]: I0930 19:52:51.088744 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0ec8d773-3769-4cd8-9da4-4696693173fa","Type":"ContainerStarted","Data":"743246fd9940a975cbb6e80cb875a3985c02f04709ada24dd371c99d6fcba62f"}
Sep 30 19:52:51 crc kubenswrapper[4553]: I0930 19:52:51.090489 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2ac042a-d1af-49f5-b573-f76fc772dabd","Type":"ContainerStarted","Data":"3fbc9ce2ef6de765d3a9dc3c47f2a2612d41a9de47cb7aebb35d7420ffd6eb71"}
Sep 30 19:52:51 crc kubenswrapper[4553]: I0930 19:52:51.090677 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2ac042a-d1af-49f5-b573-f76fc772dabd","Type":"ContainerStarted","Data":"06e79ebfb4994902da7da9859ad83c930754af0cc4acecab29ec6d6a46128a83"}
Sep 30 19:52:51 crc kubenswrapper[4553]: I0930 19:52:51.096282 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"48a9de6d-a606-4cad-9153-8891c6a322a2","Type":"ContainerStarted","Data":"f4200e59bbfd75a7e7b71b8086609a8d69923e9fc901994aec555f8189dde46e"}
Sep 30 19:52:51 crc kubenswrapper[4553]: I0930 19:52:51.108321 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.108302405 podStartE2EDuration="2.108302405s" podCreationTimestamp="2025-09-30 19:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:52:51.103744592 +0000 UTC m=+1224.303246722" watchObservedRunningTime="2025-09-30 19:52:51.108302405 +0000 UTC m=+1224.307804535"
Sep 30 19:52:51 crc kubenswrapper[4553]: I0930 19:52:51.524567 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3bd77eb-ad04-4611-a1a7-634cc17aa46c" path="/var/lib/kubelet/pods/c3bd77eb-ad04-4611-a1a7-634cc17aa46c/volumes"
Sep 30 19:52:52 crc kubenswrapper[4553]: I0930 19:52:52.108750 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2ac042a-d1af-49f5-b573-f76fc772dabd","Type":"ContainerStarted","Data":"20de42ae7a1247a68b2e5153efa9463c9d50a2151a797436cd3cab74cf69fcd6"}
Sep 30 19:52:52 crc kubenswrapper[4553]: I0930 19:52:52.111796 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"48a9de6d-a606-4cad-9153-8891c6a322a2","Type":"ContainerStarted","Data":"c90f3f220d8720579e1ea443c21ae15d2dd2355f5a8cdfd05bf4d0945a688f74"}
Sep 30 19:52:52 crc kubenswrapper[4553]: I0930 19:52:52.111867 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"48a9de6d-a606-4cad-9153-8891c6a322a2","Type":"ContainerStarted","Data":"810e75677d965584ac97dc601675941ae2f04e43be18b796e13017ca629c1ee1"}
Sep 30 19:52:52 crc kubenswrapper[4553]: I0930 19:52:52.143721 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.143690898 podStartE2EDuration="3.143690898s" podCreationTimestamp="2025-09-30 19:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:52:52.130489854 +0000 UTC m=+1225.329992034" watchObservedRunningTime="2025-09-30 19:52:52.143690898 +0000 UTC m=+1225.343193058"
Sep 30 19:52:52 crc kubenswrapper[4553]: I0930 19:52:52.163614 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.163591712 podStartE2EDuration="2.163591712s" podCreationTimestamp="2025-09-30 19:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:52:52.155462295 +0000 UTC m=+1225.354964435" watchObservedRunningTime="2025-09-30 19:52:52.163591712 +0000 UTC m=+1225.363093852"
Sep 30 19:52:54 crc kubenswrapper[4553]: I0930 19:52:54.459466 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Sep 30 19:52:55 crc kubenswrapper[4553]: I0930 19:52:55.502598 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Sep 30 19:52:55 crc kubenswrapper[4553]: I0930 19:52:55.502899 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Sep 30 19:52:59 crc kubenswrapper[4553]: I0930 19:52:59.459877 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Sep 30 19:52:59 crc kubenswrapper[4553]: I0930 19:52:59.498657 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Sep 30 19:52:59 crc kubenswrapper[4553]: I0930 19:52:59.777330 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 19:52:59 crc kubenswrapper[4553]: I0930 19:52:59.777406 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 19:53:00 crc kubenswrapper[4553]: I0930 19:53:00.243555 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Sep 30 19:53:00 crc kubenswrapper[4553]: I0930 19:53:00.502771 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Sep 30 19:53:00 crc kubenswrapper[4553]: I0930 19:53:00.502843 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Sep 30 19:53:00 crc kubenswrapper[4553]: I0930 19:53:00.799279 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d2ac042a-d1af-49f5-b573-f76fc772dabd" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 30 19:53:00 crc kubenswrapper[4553]: I0930 19:53:00.799328 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d2ac042a-d1af-49f5-b573-f76fc772dabd" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 30 19:53:01 crc kubenswrapper[4553]: I0930 19:53:01.522170 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="48a9de6d-a606-4cad-9153-8891c6a322a2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Sep 30 19:53:01 crc kubenswrapper[4553]: I0930 19:53:01.522179 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="48a9de6d-a606-4cad-9153-8891c6a322a2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 30 19:53:03 crc kubenswrapper[4553]: I0930 19:53:03.528513 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Sep 30 19:53:09 crc kubenswrapper[4553]: I0930 19:53:09.788898 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Sep 30 19:53:09 crc kubenswrapper[4553]: I0930 19:53:09.791113 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Sep 30 19:53:09 crc kubenswrapper[4553]: I0930 19:53:09.791609 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Sep 30 19:53:09 crc kubenswrapper[4553]: I0930 19:53:09.791686 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Sep 30 19:53:09 crc kubenswrapper[4553]: I0930 19:53:09.808350 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Sep 30 19:53:09 crc kubenswrapper[4553]: I0930 19:53:09.810307 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Sep 30 19:53:10 crc kubenswrapper[4553]: I0930 19:53:10.508942 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Sep 30 19:53:10 crc kubenswrapper[4553]: I0930 19:53:10.510492 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Sep 30 19:53:10 crc kubenswrapper[4553]: I0930 19:53:10.520330 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Sep 30 19:53:11 crc kubenswrapper[4553]: I0930 19:53:11.333209 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Sep 30 19:53:19 crc kubenswrapper[4553]: I0930 19:53:19.659213 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 19:53:20 crc kubenswrapper[4553]: I0930 19:53:20.559157 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Sep 30 19:53:24 crc kubenswrapper[4553]: I0930 19:53:24.441615 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="7c4de23a-3df4-47a2-86f1-436a8b11c22d" containerName="rabbitmq" containerID="cri-o://dcb3f763aeae8c9f93e921c451f3d0d555b08740e63bf5df3c4a6b6b59f53f42" gracePeriod=604796
Sep 30 19:53:25 crc kubenswrapper[4553]: I0930 19:53:25.741481 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="5bde6e85-a37e-4cec-a759-b0cd4eea2807" containerName="rabbitmq" containerID="cri-o://41dda8cfc13fb7bcb44b99ca2a255b04724c4b21275fdf8333f6b3e5af24fec8" gracePeriod=604795
Sep 30 19:53:29 crc kubenswrapper[4553]: I0930 19:53:29.272584 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="5bde6e85-a37e-4cec-a759-b0cd4eea2807" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused"
Sep 30 19:53:29 crc kubenswrapper[4553]: I0930 19:53:29.626870 4553 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="7c4de23a-3df4-47a2-86f1-436a8b11c22d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused"
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.093562 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.160792 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-config-data\") pod \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") "
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.161125 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-erlang-cookie\") pod \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") "
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.161172 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-confd\") pod \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") "
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.161226 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxwtv\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-kube-api-access-gxwtv\") pod \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") "
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.161286 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-server-conf\") pod \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") "
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.161330 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c4de23a-3df4-47a2-86f1-436a8b11c22d-pod-info\") pod \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") "
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.161389 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c4de23a-3df4-47a2-86f1-436a8b11c22d-erlang-cookie-secret\") pod \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") "
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.161428 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") "
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.161476 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-plugins-conf\") pod \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") "
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.161503 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-plugins\") pod \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") "
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.161599 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-tls\") pod \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\" (UID: \"7c4de23a-3df4-47a2-86f1-436a8b11c22d\") "
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.164405 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7c4de23a-3df4-47a2-86f1-436a8b11c22d" (UID: "7c4de23a-3df4-47a2-86f1-436a8b11c22d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.165933 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7c4de23a-3df4-47a2-86f1-436a8b11c22d" (UID: "7c4de23a-3df4-47a2-86f1-436a8b11c22d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.166624 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7c4de23a-3df4-47a2-86f1-436a8b11c22d" (UID: "7c4de23a-3df4-47a2-86f1-436a8b11c22d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.171099 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7c4de23a-3df4-47a2-86f1-436a8b11c22d-pod-info" (OuterVolumeSpecName: "pod-info") pod "7c4de23a-3df4-47a2-86f1-436a8b11c22d" (UID: "7c4de23a-3df4-47a2-86f1-436a8b11c22d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.196910 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7c4de23a-3df4-47a2-86f1-436a8b11c22d" (UID: "7c4de23a-3df4-47a2-86f1-436a8b11c22d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.197145 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-kube-api-access-gxwtv" (OuterVolumeSpecName: "kube-api-access-gxwtv") pod "7c4de23a-3df4-47a2-86f1-436a8b11c22d" (UID: "7c4de23a-3df4-47a2-86f1-436a8b11c22d"). InnerVolumeSpecName "kube-api-access-gxwtv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.199497 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "7c4de23a-3df4-47a2-86f1-436a8b11c22d" (UID: "7c4de23a-3df4-47a2-86f1-436a8b11c22d"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.200243 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c4de23a-3df4-47a2-86f1-436a8b11c22d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7c4de23a-3df4-47a2-86f1-436a8b11c22d" (UID: "7c4de23a-3df4-47a2-86f1-436a8b11c22d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.226663 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-config-data" (OuterVolumeSpecName: "config-data") pod "7c4de23a-3df4-47a2-86f1-436a8b11c22d" (UID: "7c4de23a-3df4-47a2-86f1-436a8b11c22d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.263892 4553 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c4de23a-3df4-47a2-86f1-436a8b11c22d-pod-info\") on node \"crc\" DevicePath \"\""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.263923 4553 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c4de23a-3df4-47a2-86f1-436a8b11c22d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.263951 4553 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" "
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.263984 4553 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-plugins-conf\") on node \"crc\" DevicePath \"\""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.263994 4553 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.264002 4553 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.264011 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.264022 4553 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.264031 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxwtv\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-kube-api-access-gxwtv\") on node \"crc\" DevicePath \"\""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.278249 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-server-conf" (OuterVolumeSpecName: "server-conf") pod "7c4de23a-3df4-47a2-86f1-436a8b11c22d" (UID: "7c4de23a-3df4-47a2-86f1-436a8b11c22d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.299936 4553 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc"
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.356199 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7c4de23a-3df4-47a2-86f1-436a8b11c22d" (UID: "7c4de23a-3df4-47a2-86f1-436a8b11c22d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.365918 4553 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c4de23a-3df4-47a2-86f1-436a8b11c22d-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.365943 4553 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c4de23a-3df4-47a2-86f1-436a8b11c22d-server-conf\") on node \"crc\" DevicePath \"\""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.365953 4553 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.524230 4553 generic.go:334] "Generic (PLEG): container finished" podID="7c4de23a-3df4-47a2-86f1-436a8b11c22d" containerID="dcb3f763aeae8c9f93e921c451f3d0d555b08740e63bf5df3c4a6b6b59f53f42" exitCode=0
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.524270 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7c4de23a-3df4-47a2-86f1-436a8b11c22d","Type":"ContainerDied","Data":"dcb3f763aeae8c9f93e921c451f3d0d555b08740e63bf5df3c4a6b6b59f53f42"}
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.524303 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7c4de23a-3df4-47a2-86f1-436a8b11c22d","Type":"ContainerDied","Data":"2417903089be013d11ef24537688194758861d1cdf3a423447473d98b078380a"}
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.524322 4553 scope.go:117] "RemoveContainer" containerID="dcb3f763aeae8c9f93e921c451f3d0d555b08740e63bf5df3c4a6b6b59f53f42"
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.524322 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.553787 4553 scope.go:117] "RemoveContainer" containerID="d5f0840ea8aa7f8cfbf4b6f00581c0a035ac04301f1152a3204140e7bb6e4c85"
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.571627 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.581489 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.587913 4553 scope.go:117] "RemoveContainer" containerID="dcb3f763aeae8c9f93e921c451f3d0d555b08740e63bf5df3c4a6b6b59f53f42"
Sep 30 19:53:31 crc kubenswrapper[4553]: E0930 19:53:31.588373 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcb3f763aeae8c9f93e921c451f3d0d555b08740e63bf5df3c4a6b6b59f53f42\": container with ID starting with dcb3f763aeae8c9f93e921c451f3d0d555b08740e63bf5df3c4a6b6b59f53f42 not found: ID does not exist" containerID="dcb3f763aeae8c9f93e921c451f3d0d555b08740e63bf5df3c4a6b6b59f53f42"
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.588401 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcb3f763aeae8c9f93e921c451f3d0d555b08740e63bf5df3c4a6b6b59f53f42"} err="failed to get container status \"dcb3f763aeae8c9f93e921c451f3d0d555b08740e63bf5df3c4a6b6b59f53f42\": rpc error: code = NotFound desc = could not find container \"dcb3f763aeae8c9f93e921c451f3d0d555b08740e63bf5df3c4a6b6b59f53f42\": container with ID starting with dcb3f763aeae8c9f93e921c451f3d0d555b08740e63bf5df3c4a6b6b59f53f42 not found: ID does not exist"
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.588419 4553 scope.go:117] "RemoveContainer" containerID="d5f0840ea8aa7f8cfbf4b6f00581c0a035ac04301f1152a3204140e7bb6e4c85"
Sep 30 19:53:31 crc kubenswrapper[4553]: E0930 19:53:31.588893 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5f0840ea8aa7f8cfbf4b6f00581c0a035ac04301f1152a3204140e7bb6e4c85\": container with ID starting with d5f0840ea8aa7f8cfbf4b6f00581c0a035ac04301f1152a3204140e7bb6e4c85 not found: ID does not exist" containerID="d5f0840ea8aa7f8cfbf4b6f00581c0a035ac04301f1152a3204140e7bb6e4c85"
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.588922 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f0840ea8aa7f8cfbf4b6f00581c0a035ac04301f1152a3204140e7bb6e4c85"} err="failed to get container status \"d5f0840ea8aa7f8cfbf4b6f00581c0a035ac04301f1152a3204140e7bb6e4c85\": rpc error: code = NotFound desc = could not find container \"d5f0840ea8aa7f8cfbf4b6f00581c0a035ac04301f1152a3204140e7bb6e4c85\": container with ID starting with d5f0840ea8aa7f8cfbf4b6f00581c0a035ac04301f1152a3204140e7bb6e4c85 not found: ID does not exist"
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.599095 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 19:53:31 crc kubenswrapper[4553]: E0930 19:53:31.599568 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c4de23a-3df4-47a2-86f1-436a8b11c22d" containerName="rabbitmq"
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.599588 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c4de23a-3df4-47a2-86f1-436a8b11c22d" containerName="rabbitmq"
Sep 30 19:53:31 crc kubenswrapper[4553]: E0930 19:53:31.599608 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c4de23a-3df4-47a2-86f1-436a8b11c22d" containerName="setup-container"
Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.599616 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c4de23a-3df4-47a2-86f1-436a8b11c22d" containerName="setup-container"
Sep 30 19:53:31 crc
kubenswrapper[4553]: I0930 19:53:31.599824 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c4de23a-3df4-47a2-86f1-436a8b11c22d" containerName="rabbitmq" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.602825 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.609501 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.609562 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.609584 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.609758 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.612248 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.612385 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-z6vj8" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.612487 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.625524 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.673957 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec990c27-2305-4611-83a7-1849ba2dffa9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.674011 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec990c27-2305-4611-83a7-1849ba2dffa9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.674049 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec990c27-2305-4611-83a7-1849ba2dffa9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.674067 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec990c27-2305-4611-83a7-1849ba2dffa9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.674083 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec990c27-2305-4611-83a7-1849ba2dffa9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.674151 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec990c27-2305-4611-83a7-1849ba2dffa9-config-data\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " 
pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.674170 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec990c27-2305-4611-83a7-1849ba2dffa9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.674187 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec990c27-2305-4611-83a7-1849ba2dffa9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.674208 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec990c27-2305-4611-83a7-1849ba2dffa9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.674251 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgqwn\" (UniqueName: \"kubernetes.io/projected/ec990c27-2305-4611-83a7-1849ba2dffa9-kube-api-access-kgqwn\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.674268 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 
19:53:31.774857 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec990c27-2305-4611-83a7-1849ba2dffa9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.775681 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec990c27-2305-4611-83a7-1849ba2dffa9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.775632 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec990c27-2305-4611-83a7-1849ba2dffa9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.775761 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec990c27-2305-4611-83a7-1849ba2dffa9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.775780 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec990c27-2305-4611-83a7-1849ba2dffa9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.776056 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ec990c27-2305-4611-83a7-1849ba2dffa9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.776503 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec990c27-2305-4611-83a7-1849ba2dffa9-config-data\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.776531 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec990c27-2305-4611-83a7-1849ba2dffa9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.776551 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec990c27-2305-4611-83a7-1849ba2dffa9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.776575 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec990c27-2305-4611-83a7-1849ba2dffa9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.776624 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgqwn\" (UniqueName: \"kubernetes.io/projected/ec990c27-2305-4611-83a7-1849ba2dffa9-kube-api-access-kgqwn\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " 
pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.776644 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.776667 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec990c27-2305-4611-83a7-1849ba2dffa9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.777679 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.778027 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec990c27-2305-4611-83a7-1849ba2dffa9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.778466 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec990c27-2305-4611-83a7-1849ba2dffa9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.778846 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/ec990c27-2305-4611-83a7-1849ba2dffa9-config-data\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.782313 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec990c27-2305-4611-83a7-1849ba2dffa9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.782748 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec990c27-2305-4611-83a7-1849ba2dffa9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.783116 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec990c27-2305-4611-83a7-1849ba2dffa9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.783120 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec990c27-2305-4611-83a7-1849ba2dffa9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.805988 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgqwn\" (UniqueName: \"kubernetes.io/projected/ec990c27-2305-4611-83a7-1849ba2dffa9-kube-api-access-kgqwn\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " 
pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.833315 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-server-0\" (UID: \"ec990c27-2305-4611-83a7-1849ba2dffa9\") " pod="openstack/rabbitmq-server-0" Sep 30 19:53:31 crc kubenswrapper[4553]: I0930 19:53:31.924484 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.454081 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.489454 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-config-data\") pod \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.489741 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxlh8\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-kube-api-access-wxlh8\") pod \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.489797 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-plugins\") pod \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.489823 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-server-conf\") pod \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.489845 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-plugins-conf\") pod \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.489873 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-tls\") pod \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.489891 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5bde6e85-a37e-4cec-a759-b0cd4eea2807-erlang-cookie-secret\") pod \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.489997 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-confd\") pod \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.490025 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5bde6e85-a37e-4cec-a759-b0cd4eea2807-pod-info\") pod \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.490078 
4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.490106 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-erlang-cookie\") pod \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\" (UID: \"5bde6e85-a37e-4cec-a759-b0cd4eea2807\") " Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.492600 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5bde6e85-a37e-4cec-a759-b0cd4eea2807" (UID: "5bde6e85-a37e-4cec-a759-b0cd4eea2807"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.493888 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5bde6e85-a37e-4cec-a759-b0cd4eea2807" (UID: "5bde6e85-a37e-4cec-a759-b0cd4eea2807"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.499137 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-kube-api-access-wxlh8" (OuterVolumeSpecName: "kube-api-access-wxlh8") pod "5bde6e85-a37e-4cec-a759-b0cd4eea2807" (UID: "5bde6e85-a37e-4cec-a759-b0cd4eea2807"). InnerVolumeSpecName "kube-api-access-wxlh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.502101 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5bde6e85-a37e-4cec-a759-b0cd4eea2807" (UID: "5bde6e85-a37e-4cec-a759-b0cd4eea2807"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.509121 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5bde6e85-a37e-4cec-a759-b0cd4eea2807-pod-info" (OuterVolumeSpecName: "pod-info") pod "5bde6e85-a37e-4cec-a759-b0cd4eea2807" (UID: "5bde6e85-a37e-4cec-a759-b0cd4eea2807"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.511196 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5bde6e85-a37e-4cec-a759-b0cd4eea2807" (UID: "5bde6e85-a37e-4cec-a759-b0cd4eea2807"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.525579 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bde6e85-a37e-4cec-a759-b0cd4eea2807-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5bde6e85-a37e-4cec-a759-b0cd4eea2807" (UID: "5bde6e85-a37e-4cec-a759-b0cd4eea2807"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.538974 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "5bde6e85-a37e-4cec-a759-b0cd4eea2807" (UID: "5bde6e85-a37e-4cec-a759-b0cd4eea2807"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.548165 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-config-data" (OuterVolumeSpecName: "config-data") pod "5bde6e85-a37e-4cec-a759-b0cd4eea2807" (UID: "5bde6e85-a37e-4cec-a759-b0cd4eea2807"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.567288 4553 generic.go:334] "Generic (PLEG): container finished" podID="5bde6e85-a37e-4cec-a759-b0cd4eea2807" containerID="41dda8cfc13fb7bcb44b99ca2a255b04724c4b21275fdf8333f6b3e5af24fec8" exitCode=0 Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.567327 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5bde6e85-a37e-4cec-a759-b0cd4eea2807","Type":"ContainerDied","Data":"41dda8cfc13fb7bcb44b99ca2a255b04724c4b21275fdf8333f6b3e5af24fec8"} Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.567354 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5bde6e85-a37e-4cec-a759-b0cd4eea2807","Type":"ContainerDied","Data":"351e46539d24d282f706b3cdbf5f9b471b552a89e7740a87a6d0e65ccb82fd01"} Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.567370 4553 scope.go:117] "RemoveContainer" containerID="41dda8cfc13fb7bcb44b99ca2a255b04724c4b21275fdf8333f6b3e5af24fec8" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 
19:53:32.567480 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.591804 4553 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5bde6e85-a37e-4cec-a759-b0cd4eea2807-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.591844 4553 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.591854 4553 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.591868 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.591877 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxlh8\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-kube-api-access-wxlh8\") on node \"crc\" DevicePath \"\"" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.591885 4553 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.591895 4553 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-plugins-conf\") on node \"crc\" 
DevicePath \"\"" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.591902 4553 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.591910 4553 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5bde6e85-a37e-4cec-a759-b0cd4eea2807-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.622645 4553 scope.go:117] "RemoveContainer" containerID="56d474ee9d05db649aeef6acfb381ffed38ae8760337766c87b2088290b1b484" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.624739 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-server-conf" (OuterVolumeSpecName: "server-conf") pod "5bde6e85-a37e-4cec-a759-b0cd4eea2807" (UID: "5bde6e85-a37e-4cec-a759-b0cd4eea2807"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.650183 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.663992 4553 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.691900 4553 scope.go:117] "RemoveContainer" containerID="41dda8cfc13fb7bcb44b99ca2a255b04724c4b21275fdf8333f6b3e5af24fec8" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.695484 4553 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.695515 4553 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5bde6e85-a37e-4cec-a759-b0cd4eea2807-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 19:53:32 crc kubenswrapper[4553]: E0930 19:53:32.699307 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41dda8cfc13fb7bcb44b99ca2a255b04724c4b21275fdf8333f6b3e5af24fec8\": container with ID starting with 41dda8cfc13fb7bcb44b99ca2a255b04724c4b21275fdf8333f6b3e5af24fec8 not found: ID does not exist" containerID="41dda8cfc13fb7bcb44b99ca2a255b04724c4b21275fdf8333f6b3e5af24fec8" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.699569 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41dda8cfc13fb7bcb44b99ca2a255b04724c4b21275fdf8333f6b3e5af24fec8"} err="failed to get container status \"41dda8cfc13fb7bcb44b99ca2a255b04724c4b21275fdf8333f6b3e5af24fec8\": rpc error: code = NotFound desc = could not find container 
\"41dda8cfc13fb7bcb44b99ca2a255b04724c4b21275fdf8333f6b3e5af24fec8\": container with ID starting with 41dda8cfc13fb7bcb44b99ca2a255b04724c4b21275fdf8333f6b3e5af24fec8 not found: ID does not exist" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.699668 4553 scope.go:117] "RemoveContainer" containerID="56d474ee9d05db649aeef6acfb381ffed38ae8760337766c87b2088290b1b484" Sep 30 19:53:32 crc kubenswrapper[4553]: E0930 19:53:32.700241 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56d474ee9d05db649aeef6acfb381ffed38ae8760337766c87b2088290b1b484\": container with ID starting with 56d474ee9d05db649aeef6acfb381ffed38ae8760337766c87b2088290b1b484 not found: ID does not exist" containerID="56d474ee9d05db649aeef6acfb381ffed38ae8760337766c87b2088290b1b484" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.700343 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d474ee9d05db649aeef6acfb381ffed38ae8760337766c87b2088290b1b484"} err="failed to get container status \"56d474ee9d05db649aeef6acfb381ffed38ae8760337766c87b2088290b1b484\": rpc error: code = NotFound desc = could not find container \"56d474ee9d05db649aeef6acfb381ffed38ae8760337766c87b2088290b1b484\": container with ID starting with 56d474ee9d05db649aeef6acfb381ffed38ae8760337766c87b2088290b1b484 not found: ID does not exist" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.847369 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5bde6e85-a37e-4cec-a759-b0cd4eea2807" (UID: "5bde6e85-a37e-4cec-a759-b0cd4eea2807"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.898385 4553 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5bde6e85-a37e-4cec-a759-b0cd4eea2807-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.928953 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.937206 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.956458 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 19:53:32 crc kubenswrapper[4553]: E0930 19:53:32.956812 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bde6e85-a37e-4cec-a759-b0cd4eea2807" containerName="rabbitmq" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.956827 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bde6e85-a37e-4cec-a759-b0cd4eea2807" containerName="rabbitmq" Sep 30 19:53:32 crc kubenswrapper[4553]: E0930 19:53:32.956853 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bde6e85-a37e-4cec-a759-b0cd4eea2807" containerName="setup-container" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.956860 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bde6e85-a37e-4cec-a759-b0cd4eea2807" containerName="setup-container" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.957046 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bde6e85-a37e-4cec-a759-b0cd4eea2807" containerName="rabbitmq" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.957941 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.965356 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.965425 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.965527 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.965692 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-phdzt" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.965699 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.965828 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.965876 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 19:53:32 crc kubenswrapper[4553]: I0930 19:53:32.988646 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.101768 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g6cd\" (UniqueName: \"kubernetes.io/projected/ae9d46b0-115e-4584-8fd4-b89e0f635103-kube-api-access-2g6cd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.101847 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ae9d46b0-115e-4584-8fd4-b89e0f635103-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.101874 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ae9d46b0-115e-4584-8fd4-b89e0f635103-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.101899 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae9d46b0-115e-4584-8fd4-b89e0f635103-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.102197 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ae9d46b0-115e-4584-8fd4-b89e0f635103-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.102317 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ae9d46b0-115e-4584-8fd4-b89e0f635103-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.102425 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ae9d46b0-115e-4584-8fd4-b89e0f635103-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.102449 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ae9d46b0-115e-4584-8fd4-b89e0f635103-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.102469 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ae9d46b0-115e-4584-8fd4-b89e0f635103-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.102547 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.102755 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ae9d46b0-115e-4584-8fd4-b89e0f635103-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.203890 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.203935 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ae9d46b0-115e-4584-8fd4-b89e0f635103-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.203974 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g6cd\" (UniqueName: \"kubernetes.io/projected/ae9d46b0-115e-4584-8fd4-b89e0f635103-kube-api-access-2g6cd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.204030 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ae9d46b0-115e-4584-8fd4-b89e0f635103-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.204073 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ae9d46b0-115e-4584-8fd4-b89e0f635103-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.204098 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae9d46b0-115e-4584-8fd4-b89e0f635103-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.204232 4553 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.204389 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ae9d46b0-115e-4584-8fd4-b89e0f635103-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.204922 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ae9d46b0-115e-4584-8fd4-b89e0f635103-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.205193 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ae9d46b0-115e-4584-8fd4-b89e0f635103-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.205270 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ae9d46b0-115e-4584-8fd4-b89e0f635103-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" 
Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.205325 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ae9d46b0-115e-4584-8fd4-b89e0f635103-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.205351 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ae9d46b0-115e-4584-8fd4-b89e0f635103-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.205371 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ae9d46b0-115e-4584-8fd4-b89e0f635103-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.205395 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ae9d46b0-115e-4584-8fd4-b89e0f635103-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.205558 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae9d46b0-115e-4584-8fd4-b89e0f635103-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.205805 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ae9d46b0-115e-4584-8fd4-b89e0f635103-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.208702 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ae9d46b0-115e-4584-8fd4-b89e0f635103-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.208801 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ae9d46b0-115e-4584-8fd4-b89e0f635103-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.209440 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ae9d46b0-115e-4584-8fd4-b89e0f635103-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.210800 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ae9d46b0-115e-4584-8fd4-b89e0f635103-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.235316 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g6cd\" (UniqueName: \"kubernetes.io/projected/ae9d46b0-115e-4584-8fd4-b89e0f635103-kube-api-access-2g6cd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.235473 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae9d46b0-115e-4584-8fd4-b89e0f635103\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.276569 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.515146 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bde6e85-a37e-4cec-a759-b0cd4eea2807" path="/var/lib/kubelet/pods/5bde6e85-a37e-4cec-a759-b0cd4eea2807/volumes" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.517301 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c4de23a-3df4-47a2-86f1-436a8b11c22d" path="/var/lib/kubelet/pods/7c4de23a-3df4-47a2-86f1-436a8b11c22d/volumes" Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.575501 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ec990c27-2305-4611-83a7-1849ba2dffa9","Type":"ContainerStarted","Data":"6fb5e01fd1a593e36750532b20f825d9343226506900fce95b662b73f16b1b4c"} Sep 30 19:53:33 crc kubenswrapper[4553]: I0930 19:53:33.716610 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 19:53:33 crc kubenswrapper[4553]: W0930 19:53:33.726108 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae9d46b0_115e_4584_8fd4_b89e0f635103.slice/crio-161d51ed66e40971a13544f89caf89fdc9252980ff382ff732ca565d810aec2c WatchSource:0}: Error finding container 
161d51ed66e40971a13544f89caf89fdc9252980ff382ff732ca565d810aec2c: Status 404 returned error can't find the container with id 161d51ed66e40971a13544f89caf89fdc9252980ff382ff732ca565d810aec2c Sep 30 19:53:34 crc kubenswrapper[4553]: I0930 19:53:34.591301 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ae9d46b0-115e-4584-8fd4-b89e0f635103","Type":"ContainerStarted","Data":"161d51ed66e40971a13544f89caf89fdc9252980ff382ff732ca565d810aec2c"} Sep 30 19:53:34 crc kubenswrapper[4553]: I0930 19:53:34.594446 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ec990c27-2305-4611-83a7-1849ba2dffa9","Type":"ContainerStarted","Data":"ca496bec1fc3d50600b04d527a8c06c90bcce4088f0abcb19d587f0f01c3cd0f"} Sep 30 19:53:36 crc kubenswrapper[4553]: I0930 19:53:36.615134 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ae9d46b0-115e-4584-8fd4-b89e0f635103","Type":"ContainerStarted","Data":"e8e511ec6d05f0d2f924304b5ba417cd256c3600ec7ed8352bab315c549c5c1a"} Sep 30 19:53:57 crc kubenswrapper[4553]: I0930 19:53:57.840653 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4hfkh/must-gather-jw59q"] Sep 30 19:53:57 crc kubenswrapper[4553]: I0930 19:53:57.843110 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hfkh/must-gather-jw59q" Sep 30 19:53:57 crc kubenswrapper[4553]: I0930 19:53:57.848513 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4hfkh"/"openshift-service-ca.crt" Sep 30 19:53:57 crc kubenswrapper[4553]: I0930 19:53:57.851517 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4hfkh"/"kube-root-ca.crt" Sep 30 19:53:57 crc kubenswrapper[4553]: I0930 19:53:57.967433 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4hfkh/must-gather-jw59q"] Sep 30 19:53:57 crc kubenswrapper[4553]: I0930 19:53:57.972204 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5f02ec77-88fd-40c9-8b0f-085d34da84f7-must-gather-output\") pod \"must-gather-jw59q\" (UID: \"5f02ec77-88fd-40c9-8b0f-085d34da84f7\") " pod="openshift-must-gather-4hfkh/must-gather-jw59q" Sep 30 19:53:57 crc kubenswrapper[4553]: I0930 19:53:57.972297 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbmvk\" (UniqueName: \"kubernetes.io/projected/5f02ec77-88fd-40c9-8b0f-085d34da84f7-kube-api-access-dbmvk\") pod \"must-gather-jw59q\" (UID: \"5f02ec77-88fd-40c9-8b0f-085d34da84f7\") " pod="openshift-must-gather-4hfkh/must-gather-jw59q" Sep 30 19:53:58 crc kubenswrapper[4553]: I0930 19:53:58.074168 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5f02ec77-88fd-40c9-8b0f-085d34da84f7-must-gather-output\") pod \"must-gather-jw59q\" (UID: \"5f02ec77-88fd-40c9-8b0f-085d34da84f7\") " pod="openshift-must-gather-4hfkh/must-gather-jw59q" Sep 30 19:53:58 crc kubenswrapper[4553]: I0930 19:53:58.074514 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dbmvk\" (UniqueName: \"kubernetes.io/projected/5f02ec77-88fd-40c9-8b0f-085d34da84f7-kube-api-access-dbmvk\") pod \"must-gather-jw59q\" (UID: \"5f02ec77-88fd-40c9-8b0f-085d34da84f7\") " pod="openshift-must-gather-4hfkh/must-gather-jw59q" Sep 30 19:53:58 crc kubenswrapper[4553]: I0930 19:53:58.074741 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5f02ec77-88fd-40c9-8b0f-085d34da84f7-must-gather-output\") pod \"must-gather-jw59q\" (UID: \"5f02ec77-88fd-40c9-8b0f-085d34da84f7\") " pod="openshift-must-gather-4hfkh/must-gather-jw59q" Sep 30 19:53:58 crc kubenswrapper[4553]: I0930 19:53:58.096845 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbmvk\" (UniqueName: \"kubernetes.io/projected/5f02ec77-88fd-40c9-8b0f-085d34da84f7-kube-api-access-dbmvk\") pod \"must-gather-jw59q\" (UID: \"5f02ec77-88fd-40c9-8b0f-085d34da84f7\") " pod="openshift-must-gather-4hfkh/must-gather-jw59q" Sep 30 19:53:58 crc kubenswrapper[4553]: I0930 19:53:58.177496 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hfkh/must-gather-jw59q" Sep 30 19:53:58 crc kubenswrapper[4553]: W0930 19:53:58.710172 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f02ec77_88fd_40c9_8b0f_085d34da84f7.slice/crio-6489af7520355b69f830d8fa05d7f7cd88c722a474c4535aae9c8c0c519e7c82 WatchSource:0}: Error finding container 6489af7520355b69f830d8fa05d7f7cd88c722a474c4535aae9c8c0c519e7c82: Status 404 returned error can't find the container with id 6489af7520355b69f830d8fa05d7f7cd88c722a474c4535aae9c8c0c519e7c82 Sep 30 19:53:58 crc kubenswrapper[4553]: I0930 19:53:58.710902 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4hfkh/must-gather-jw59q"] Sep 30 19:53:58 crc kubenswrapper[4553]: I0930 19:53:58.879705 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hfkh/must-gather-jw59q" event={"ID":"5f02ec77-88fd-40c9-8b0f-085d34da84f7","Type":"ContainerStarted","Data":"6489af7520355b69f830d8fa05d7f7cd88c722a474c4535aae9c8c0c519e7c82"} Sep 30 19:53:59 crc kubenswrapper[4553]: I0930 19:53:59.584653 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:53:59 crc kubenswrapper[4553]: I0930 19:53:59.584724 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:54:03 crc kubenswrapper[4553]: I0930 19:54:03.920915 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-4hfkh/must-gather-jw59q" event={"ID":"5f02ec77-88fd-40c9-8b0f-085d34da84f7","Type":"ContainerStarted","Data":"ef9b42a2d8b2c7421ea1541f10f5b47a95642daba9bece0e87ff66bd74683e5e"} Sep 30 19:54:03 crc kubenswrapper[4553]: I0930 19:54:03.921485 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hfkh/must-gather-jw59q" event={"ID":"5f02ec77-88fd-40c9-8b0f-085d34da84f7","Type":"ContainerStarted","Data":"32fef0e1b3c13456c6e8f98918416ec3cbce154c7b4f1438f7a4fac930ffa1a1"} Sep 30 19:54:03 crc kubenswrapper[4553]: I0930 19:54:03.945246 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4hfkh/must-gather-jw59q" podStartSLOduration=2.597692523 podStartE2EDuration="6.945216746s" podCreationTimestamp="2025-09-30 19:53:57 +0000 UTC" firstStartedPulling="2025-09-30 19:53:58.71254261 +0000 UTC m=+1291.912044740" lastFinishedPulling="2025-09-30 19:54:03.060066833 +0000 UTC m=+1296.259568963" observedRunningTime="2025-09-30 19:54:03.934214811 +0000 UTC m=+1297.133716931" watchObservedRunningTime="2025-09-30 19:54:03.945216746 +0000 UTC m=+1297.144718916" Sep 30 19:54:06 crc kubenswrapper[4553]: I0930 19:54:06.951202 4553 generic.go:334] "Generic (PLEG): container finished" podID="ec990c27-2305-4611-83a7-1849ba2dffa9" containerID="ca496bec1fc3d50600b04d527a8c06c90bcce4088f0abcb19d587f0f01c3cd0f" exitCode=0 Sep 30 19:54:06 crc kubenswrapper[4553]: I0930 19:54:06.951285 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ec990c27-2305-4611-83a7-1849ba2dffa9","Type":"ContainerDied","Data":"ca496bec1fc3d50600b04d527a8c06c90bcce4088f0abcb19d587f0f01c3cd0f"} Sep 30 19:54:07 crc kubenswrapper[4553]: I0930 19:54:07.227012 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4hfkh/crc-debug-6s5kj"] Sep 30 19:54:07 crc kubenswrapper[4553]: I0930 19:54:07.228533 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" Sep 30 19:54:07 crc kubenswrapper[4553]: I0930 19:54:07.231333 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4hfkh"/"default-dockercfg-rxxz9" Sep 30 19:54:07 crc kubenswrapper[4553]: I0930 19:54:07.281918 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef27c2ea-1df1-46cc-82a4-eb78cd73c09c-host\") pod \"crc-debug-6s5kj\" (UID: \"ef27c2ea-1df1-46cc-82a4-eb78cd73c09c\") " pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" Sep 30 19:54:07 crc kubenswrapper[4553]: I0930 19:54:07.281959 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zncf2\" (UniqueName: \"kubernetes.io/projected/ef27c2ea-1df1-46cc-82a4-eb78cd73c09c-kube-api-access-zncf2\") pod \"crc-debug-6s5kj\" (UID: \"ef27c2ea-1df1-46cc-82a4-eb78cd73c09c\") " pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" Sep 30 19:54:07 crc kubenswrapper[4553]: I0930 19:54:07.383681 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef27c2ea-1df1-46cc-82a4-eb78cd73c09c-host\") pod \"crc-debug-6s5kj\" (UID: \"ef27c2ea-1df1-46cc-82a4-eb78cd73c09c\") " pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" Sep 30 19:54:07 crc kubenswrapper[4553]: I0930 19:54:07.383731 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zncf2\" (UniqueName: \"kubernetes.io/projected/ef27c2ea-1df1-46cc-82a4-eb78cd73c09c-kube-api-access-zncf2\") pod \"crc-debug-6s5kj\" (UID: \"ef27c2ea-1df1-46cc-82a4-eb78cd73c09c\") " pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" Sep 30 19:54:07 crc kubenswrapper[4553]: I0930 19:54:07.383812 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/ef27c2ea-1df1-46cc-82a4-eb78cd73c09c-host\") pod \"crc-debug-6s5kj\" (UID: \"ef27c2ea-1df1-46cc-82a4-eb78cd73c09c\") " pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" Sep 30 19:54:07 crc kubenswrapper[4553]: I0930 19:54:07.399684 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zncf2\" (UniqueName: \"kubernetes.io/projected/ef27c2ea-1df1-46cc-82a4-eb78cd73c09c-kube-api-access-zncf2\") pod \"crc-debug-6s5kj\" (UID: \"ef27c2ea-1df1-46cc-82a4-eb78cd73c09c\") " pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" Sep 30 19:54:07 crc kubenswrapper[4553]: I0930 19:54:07.548622 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" Sep 30 19:54:07 crc kubenswrapper[4553]: W0930 19:54:07.592402 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef27c2ea_1df1_46cc_82a4_eb78cd73c09c.slice/crio-dbfd9bdd3136d4d5ffbf256da335e99dc2d98ee75078de1014d6a33fc7652982 WatchSource:0}: Error finding container dbfd9bdd3136d4d5ffbf256da335e99dc2d98ee75078de1014d6a33fc7652982: Status 404 returned error can't find the container with id dbfd9bdd3136d4d5ffbf256da335e99dc2d98ee75078de1014d6a33fc7652982 Sep 30 19:54:07 crc kubenswrapper[4553]: I0930 19:54:07.964217 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ec990c27-2305-4611-83a7-1849ba2dffa9","Type":"ContainerStarted","Data":"ea998b31445e97d74a57058538201046cd5354b82caac86188fbc390c8a2346e"} Sep 30 19:54:07 crc kubenswrapper[4553]: I0930 19:54:07.965647 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 19:54:07 crc kubenswrapper[4553]: I0930 19:54:07.971382 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" 
event={"ID":"ef27c2ea-1df1-46cc-82a4-eb78cd73c09c","Type":"ContainerStarted","Data":"dbfd9bdd3136d4d5ffbf256da335e99dc2d98ee75078de1014d6a33fc7652982"} Sep 30 19:54:08 crc kubenswrapper[4553]: I0930 19:54:08.010565 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.010549116 podStartE2EDuration="37.010549116s" podCreationTimestamp="2025-09-30 19:53:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:54:08.004383531 +0000 UTC m=+1301.203885671" watchObservedRunningTime="2025-09-30 19:54:08.010549116 +0000 UTC m=+1301.210051246" Sep 30 19:54:08 crc kubenswrapper[4553]: I0930 19:54:08.981021 4553 generic.go:334] "Generic (PLEG): container finished" podID="ae9d46b0-115e-4584-8fd4-b89e0f635103" containerID="e8e511ec6d05f0d2f924304b5ba417cd256c3600ec7ed8352bab315c549c5c1a" exitCode=0 Sep 30 19:54:08 crc kubenswrapper[4553]: I0930 19:54:08.981086 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ae9d46b0-115e-4584-8fd4-b89e0f635103","Type":"ContainerDied","Data":"e8e511ec6d05f0d2f924304b5ba417cd256c3600ec7ed8352bab315c549c5c1a"} Sep 30 19:54:09 crc kubenswrapper[4553]: I0930 19:54:09.995566 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ae9d46b0-115e-4584-8fd4-b89e0f635103","Type":"ContainerStarted","Data":"9795a50f128895093c31d1b2b9db475e314d8191bbfce66a340f8d85e6113a90"} Sep 30 19:54:09 crc kubenswrapper[4553]: I0930 19:54:09.998196 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:54:21 crc kubenswrapper[4553]: I0930 19:54:21.928243 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 19:54:21 crc kubenswrapper[4553]: I0930 19:54:21.957478 4553 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.957459232 podStartE2EDuration="49.957459232s" podCreationTimestamp="2025-09-30 19:53:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 19:54:10.028472896 +0000 UTC m=+1303.227975036" watchObservedRunningTime="2025-09-30 19:54:21.957459232 +0000 UTC m=+1315.156961362" Sep 30 19:54:22 crc kubenswrapper[4553]: I0930 19:54:22.096788 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" event={"ID":"ef27c2ea-1df1-46cc-82a4-eb78cd73c09c","Type":"ContainerStarted","Data":"1a303d4e409c7102b6ffe0a4536001600b5ac7de16d9a236d1c6ac79e5f05394"} Sep 30 19:54:23 crc kubenswrapper[4553]: I0930 19:54:23.279271 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 19:54:23 crc kubenswrapper[4553]: I0930 19:54:23.304311 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" podStartSLOduration=2.156926109 podStartE2EDuration="16.304282864s" podCreationTimestamp="2025-09-30 19:54:07 +0000 UTC" firstStartedPulling="2025-09-30 19:54:07.594009509 +0000 UTC m=+1300.793511639" lastFinishedPulling="2025-09-30 19:54:21.741366264 +0000 UTC m=+1314.940868394" observedRunningTime="2025-09-30 19:54:22.114685561 +0000 UTC m=+1315.314187691" watchObservedRunningTime="2025-09-30 19:54:23.304282864 +0000 UTC m=+1316.503784994" Sep 30 19:54:29 crc kubenswrapper[4553]: I0930 19:54:29.585292 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:54:29 
crc kubenswrapper[4553]: I0930 19:54:29.585784 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:54:43 crc kubenswrapper[4553]: I0930 19:54:43.875613 4553 scope.go:117] "RemoveContainer" containerID="27e0051eae90e7b155bbe35e06f94d7ded00aba28ff0842abbccb5bc759ecafe" Sep 30 19:54:59 crc kubenswrapper[4553]: I0930 19:54:59.584681 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:54:59 crc kubenswrapper[4553]: I0930 19:54:59.585295 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:54:59 crc kubenswrapper[4553]: I0930 19:54:59.585336 4553 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:54:59 crc kubenswrapper[4553]: I0930 19:54:59.585985 4553 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7864ee52b427b57981d569d4ee7a9292f56eb6909fb29d851a6775585474b37"} pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:54:59 crc kubenswrapper[4553]: I0930 
19:54:59.586031 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" containerID="cri-o://c7864ee52b427b57981d569d4ee7a9292f56eb6909fb29d851a6775585474b37" gracePeriod=600 Sep 30 19:55:00 crc kubenswrapper[4553]: I0930 19:55:00.419184 4553 generic.go:334] "Generic (PLEG): container finished" podID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerID="c7864ee52b427b57981d569d4ee7a9292f56eb6909fb29d851a6775585474b37" exitCode=0 Sep 30 19:55:00 crc kubenswrapper[4553]: I0930 19:55:00.419288 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerDied","Data":"c7864ee52b427b57981d569d4ee7a9292f56eb6909fb29d851a6775585474b37"} Sep 30 19:55:00 crc kubenswrapper[4553]: I0930 19:55:00.420145 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerStarted","Data":"4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29"} Sep 30 19:55:00 crc kubenswrapper[4553]: I0930 19:55:00.420175 4553 scope.go:117] "RemoveContainer" containerID="a59ed9a27838f8357f3a7a080d587703e9b1aa4272b3bbad7477f76d8c23eba2" Sep 30 19:55:10 crc kubenswrapper[4553]: I0930 19:55:10.572488 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5c89dc44dd-6ghsx_964c839f-1077-4e46-bbb9-f25807d73ed7/barbican-api/0.log" Sep 30 19:55:10 crc kubenswrapper[4553]: I0930 19:55:10.592928 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5c89dc44dd-6ghsx_964c839f-1077-4e46-bbb9-f25807d73ed7/barbican-api-log/0.log" Sep 30 19:55:10 crc kubenswrapper[4553]: I0930 19:55:10.780410 4553 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_barbican-db-create-6q88m_d5bd8102-b39f-40ee-b03d-9912adca9e41/mariadb-database-create/0.log" Sep 30 19:55:11 crc kubenswrapper[4553]: I0930 19:55:11.304490 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-sync-k52t8_c9958ea9-408e-4b14-8b23-dd1662654cd1/barbican-db-sync/0.log" Sep 30 19:55:11 crc kubenswrapper[4553]: I0930 19:55:11.491368 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-e6e0-account-create-g2wlx_b2f1d2b7-12bf-4d45-b80e-712e015a61e5/mariadb-account-create/0.log" Sep 30 19:55:11 crc kubenswrapper[4553]: I0930 19:55:11.618160 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f4f58b95d-2zhxk_d39e00ce-6b28-4add-98a2-e4330753f27e/barbican-keystone-listener/0.log" Sep 30 19:55:11 crc kubenswrapper[4553]: I0930 19:55:11.772351 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f4f58b95d-2zhxk_d39e00ce-6b28-4add-98a2-e4330753f27e/barbican-keystone-listener-log/0.log" Sep 30 19:55:11 crc kubenswrapper[4553]: I0930 19:55:11.905618 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56db9cccd9-gb669_38816918-17bb-4279-8b49-b9d696171461/barbican-worker/0.log" Sep 30 19:55:12 crc kubenswrapper[4553]: I0930 19:55:12.034311 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56db9cccd9-gb669_38816918-17bb-4279-8b49-b9d696171461/barbican-worker-log/0.log" Sep 30 19:55:12 crc kubenswrapper[4553]: I0930 19:55:12.185915 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b964830e-4e1f-449d-bee4-fa7ed59b7ffc/ceilometer-central-agent/0.log" Sep 30 19:55:12 crc kubenswrapper[4553]: I0930 19:55:12.334932 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_b964830e-4e1f-449d-bee4-fa7ed59b7ffc/ceilometer-notification-agent/0.log" Sep 30 19:55:12 crc kubenswrapper[4553]: I0930 19:55:12.365482 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b964830e-4e1f-449d-bee4-fa7ed59b7ffc/proxy-httpd/0.log" Sep 30 19:55:12 crc kubenswrapper[4553]: I0930 19:55:12.387736 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b964830e-4e1f-449d-bee4-fa7ed59b7ffc/sg-core/0.log" Sep 30 19:55:12 crc kubenswrapper[4553]: I0930 19:55:12.677813 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4e2eb0f0-7643-448e-a97e-bd6551fe128e/cinder-api/0.log" Sep 30 19:55:12 crc kubenswrapper[4553]: I0930 19:55:12.677894 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4e2eb0f0-7643-448e-a97e-bd6551fe128e/cinder-api-log/0.log" Sep 30 19:55:12 crc kubenswrapper[4553]: I0930 19:55:12.874981 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-create-qg8lx_d8f3b7b5-90a0-44bc-9ba4-40729ffe3000/mariadb-database-create/0.log" Sep 30 19:55:13 crc kubenswrapper[4553]: I0930 19:55:13.153134 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-sync-prf67_04f1abd5-5975-4038-98b3-4b6ff0e858f7/cinder-db-sync/0.log" Sep 30 19:55:13 crc kubenswrapper[4553]: I0930 19:55:13.188887 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-f77f-account-create-zqps4_ecc06475-3808-4387-8427-c82f3f77ba73/mariadb-account-create/0.log" Sep 30 19:55:13 crc kubenswrapper[4553]: I0930 19:55:13.366028 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_6bc11f5f-706a-4984-ac99-d28ee3c2f8b5/cinder-scheduler/0.log" Sep 30 19:55:13 crc kubenswrapper[4553]: I0930 19:55:13.439555 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_6bc11f5f-706a-4984-ac99-d28ee3c2f8b5/probe/0.log" Sep 30 19:55:13 crc kubenswrapper[4553]: I0930 19:55:13.700303 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-xhbcp_7a1c1e1d-56a5-4748-8ca3-e210541bbcbe/init/0.log" Sep 30 19:55:13 crc kubenswrapper[4553]: I0930 19:55:13.860490 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-xhbcp_7a1c1e1d-56a5-4748-8ca3-e210541bbcbe/init/0.log" Sep 30 19:55:13 crc kubenswrapper[4553]: I0930 19:55:13.917220 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-xhbcp_7a1c1e1d-56a5-4748-8ca3-e210541bbcbe/dnsmasq-dns/0.log" Sep 30 19:55:13 crc kubenswrapper[4553]: I0930 19:55:13.940424 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-446a-account-create-rdrh4_657cedd1-5a4e-4219-977b-92da68039989/mariadb-account-create/0.log" Sep 30 19:55:14 crc kubenswrapper[4553]: I0930 19:55:14.147327 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-sync-gw5ch_3f9a8e95-e61a-473d-a74f-cf7a6820ff97/glance-db-sync/0.log" Sep 30 19:55:14 crc kubenswrapper[4553]: I0930 19:55:14.154897 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-create-8qx8h_7810c768-948a-47c9-99e0-4b9c5c38f7ba/mariadb-database-create/0.log" Sep 30 19:55:14 crc kubenswrapper[4553]: I0930 19:55:14.353985 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4f78bdf3-c263-41a0-9594-85b8c5b0dcd0/glance-httpd/0.log" Sep 30 19:55:14 crc kubenswrapper[4553]: I0930 19:55:14.368949 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4f78bdf3-c263-41a0-9594-85b8c5b0dcd0/glance-log/0.log" Sep 30 19:55:14 crc kubenswrapper[4553]: I0930 19:55:14.548138 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_b7456ce4-1ac9-4e11-9fc2-f8680cacd86a/glance-log/0.log" Sep 30 19:55:14 crc kubenswrapper[4553]: I0930 19:55:14.584717 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b7456ce4-1ac9-4e11-9fc2-f8680cacd86a/glance-httpd/0.log" Sep 30 19:55:14 crc kubenswrapper[4553]: I0930 19:55:14.735309 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-868c6b469d-rhw7t_849f4ec8-2741-4c83-82d8-135a24b43447/horizon/0.log" Sep 30 19:55:14 crc kubenswrapper[4553]: I0930 19:55:14.844908 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-868c6b469d-rhw7t_849f4ec8-2741-4c83-82d8-135a24b43447/horizon-log/0.log" Sep 30 19:55:14 crc kubenswrapper[4553]: I0930 19:55:14.942214 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-578c97db4-g464k_fa10ffa7-097c-4ecf-bc8b-b850f29ce9f5/keystone-api/0.log" Sep 30 19:55:15 crc kubenswrapper[4553]: I0930 19:55:15.074161 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bootstrap-pk229_2633e01b-c518-4077-af93-7ba213150186/keystone-bootstrap/0.log" Sep 30 19:55:15 crc kubenswrapper[4553]: I0930 19:55:15.209786 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-create-vhp2l_cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8/mariadb-database-create/0.log" Sep 30 19:55:15 crc kubenswrapper[4553]: I0930 19:55:15.265341 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-sync-hsmcl_3f158e70-9924-417e-b100-983f574bef9a/keystone-db-sync/0.log" Sep 30 19:55:15 crc kubenswrapper[4553]: I0930 19:55:15.442453 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-e21a-account-create-tqjjr_8dac5b01-0adc-4d37-9dcb-707537a02cf0/mariadb-account-create/0.log" Sep 30 19:55:15 crc kubenswrapper[4553]: I0930 19:55:15.600867 4553 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6093f7c3-b483-4b10-89c1-ea1a4c118ca7/kube-state-metrics/0.log" Sep 30 19:55:15 crc kubenswrapper[4553]: I0930 19:55:15.891822 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69bfb64645-4wbwh_b140e797-51c4-4f37-9062-a604eef8c280/neutron-api/0.log" Sep 30 19:55:16 crc kubenswrapper[4553]: I0930 19:55:16.024768 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69bfb64645-4wbwh_b140e797-51c4-4f37-9062-a604eef8c280/neutron-httpd/0.log" Sep 30 19:55:16 crc kubenswrapper[4553]: I0930 19:55:16.201937 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-981f-account-create-68lnd_ccd42fc2-3f65-4d1f-a6bc-4c564f653f90/mariadb-account-create/0.log" Sep 30 19:55:16 crc kubenswrapper[4553]: I0930 19:55:16.389726 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-create-nvx5r_b0be7cfb-07fc-426f-a177-4199643cff46/mariadb-database-create/0.log" Sep 30 19:55:16 crc kubenswrapper[4553]: I0930 19:55:16.650422 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-sync-f7rgm_d1bf2fc0-8737-4258-9bf8-1978001043f9/neutron-db-sync/0.log" Sep 30 19:55:16 crc kubenswrapper[4553]: I0930 19:55:16.867241 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d2ac042a-d1af-49f5-b573-f76fc772dabd/nova-api-api/0.log" Sep 30 19:55:17 crc kubenswrapper[4553]: I0930 19:55:17.035622 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d2ac042a-d1af-49f5-b573-f76fc772dabd/nova-api-log/0.log" Sep 30 19:55:17 crc kubenswrapper[4553]: I0930 19:55:17.159109 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-6d09-account-create-rf9p6_31b3010c-e679-4828-b02a-7c89c82d6f17/mariadb-account-create/0.log" Sep 30 19:55:17 crc kubenswrapper[4553]: I0930 19:55:17.330609 4553 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_nova-api-db-create-xm47s_b972cf08-0eee-4970-8825-a313fdddc23a/mariadb-database-create/0.log" Sep 30 19:55:17 crc kubenswrapper[4553]: I0930 19:55:17.577441 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-8fd0-account-create-4j249_230810df-34fe-4a09-bf1a-ab53ba9faef4/mariadb-account-create/0.log" Sep 30 19:55:17 crc kubenswrapper[4553]: I0930 19:55:17.816834 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-cell-mapping-ghdd6_321c9e7b-0cfd-440b-a1c9-664990e119c5/nova-manage/0.log" Sep 30 19:55:18 crc kubenswrapper[4553]: I0930 19:55:18.081452 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_df4bf445-73fe-492e-9104-f1a0879510d4/nova-cell0-conductor-conductor/0.log" Sep 30 19:55:18 crc kubenswrapper[4553]: I0930 19:55:18.226693 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-db-sync-ndtgv_6616a935-12f1-4f60-a206-1dbcfd9a6400/nova-cell0-conductor-db-sync/0.log" Sep 30 19:55:18 crc kubenswrapper[4553]: I0930 19:55:18.422881 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-create-f7n8l_78299d5b-ca49-4bfa-a23e-c81671ab07da/mariadb-database-create/0.log" Sep 30 19:55:18 crc kubenswrapper[4553]: I0930 19:55:18.660689 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-cell-mapping-sqkv9_a4567a10-e62e-4839-b8d8-ca0207fe6341/nova-manage/0.log" Sep 30 19:55:18 crc kubenswrapper[4553]: I0930 19:55:18.962588 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_12a2252d-7bc1-4a07-ae99-6fbfa13df27f/nova-cell1-conductor-conductor/0.log" Sep 30 19:55:19 crc kubenswrapper[4553]: I0930 19:55:19.091540 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-db-sync-xf5kh_1722205d-27ba-4709-bca4-744114e7f16f/nova-cell1-conductor-db-sync/0.log" Sep 30 19:55:19 crc kubenswrapper[4553]: I0930 19:55:19.357317 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-create-5p4j4_b60eae4e-80e4-4f1d-b7d9-7b498649fa67/mariadb-database-create/0.log" Sep 30 19:55:19 crc kubenswrapper[4553]: I0930 19:55:19.562397 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-eee0-account-create-x9tl2_82733d90-45f9-482e-a453-3b52a14b064e/mariadb-account-create/0.log" Sep 30 19:55:19 crc kubenswrapper[4553]: I0930 19:55:19.771907 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_105050db-f2fe-48cc-ac77-649e4f2f2a83/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 19:55:20 crc kubenswrapper[4553]: I0930 19:55:20.072543 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_48a9de6d-a606-4cad-9153-8891c6a322a2/nova-metadata-log/0.log" Sep 30 19:55:20 crc kubenswrapper[4553]: I0930 19:55:20.153064 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_48a9de6d-a606-4cad-9153-8891c6a322a2/nova-metadata-metadata/0.log" Sep 30 19:55:20 crc kubenswrapper[4553]: I0930 19:55:20.476480 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a92d686d-50a7-44ab-80e0-5e5ee452045c/mysql-bootstrap/0.log" Sep 30 19:55:20 crc kubenswrapper[4553]: I0930 19:55:20.615485 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_0ec8d773-3769-4cd8-9da4-4696693173fa/nova-scheduler-scheduler/0.log" Sep 30 19:55:20 crc kubenswrapper[4553]: I0930 19:55:20.794420 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a92d686d-50a7-44ab-80e0-5e5ee452045c/mysql-bootstrap/0.log" Sep 30 19:55:20 crc kubenswrapper[4553]: I0930 
19:55:20.817336 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a92d686d-50a7-44ab-80e0-5e5ee452045c/galera/0.log" Sep 30 19:55:21 crc kubenswrapper[4553]: I0930 19:55:21.088139 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d0712b30-32a7-4e50-b263-c4b3d92b6f0e/mysql-bootstrap/0.log" Sep 30 19:55:21 crc kubenswrapper[4553]: I0930 19:55:21.339364 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d0712b30-32a7-4e50-b263-c4b3d92b6f0e/galera/0.log" Sep 30 19:55:21 crc kubenswrapper[4553]: I0930 19:55:21.361201 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d0712b30-32a7-4e50-b263-c4b3d92b6f0e/mysql-bootstrap/0.log" Sep 30 19:55:21 crc kubenswrapper[4553]: I0930 19:55:21.710858 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b81cd96a-9a9f-4334-8512-34fb38da918f/openstackclient/0.log" Sep 30 19:55:21 crc kubenswrapper[4553]: I0930 19:55:21.724633 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-zj5kg_e34b6b2d-694a-48f1-a7a1-3fd03f868af6/openstack-network-exporter/0.log" Sep 30 19:55:21 crc kubenswrapper[4553]: I0930 19:55:21.985418 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zwpmt_e06ee589-214b-45a3-ab70-b71c4dfba2f9/ovsdb-server-init/0.log" Sep 30 19:55:22 crc kubenswrapper[4553]: I0930 19:55:22.241592 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zwpmt_e06ee589-214b-45a3-ab70-b71c4dfba2f9/ovsdb-server-init/0.log" Sep 30 19:55:22 crc kubenswrapper[4553]: I0930 19:55:22.295302 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zwpmt_e06ee589-214b-45a3-ab70-b71c4dfba2f9/ovs-vswitchd/0.log" Sep 30 19:55:22 crc kubenswrapper[4553]: I0930 19:55:22.302686 4553 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zwpmt_e06ee589-214b-45a3-ab70-b71c4dfba2f9/ovsdb-server/0.log" Sep 30 19:55:22 crc kubenswrapper[4553]: I0930 19:55:22.615318 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-r4k44_9e6cc85b-124a-415e-a4f1-17219da3165c/ovn-controller/0.log" Sep 30 19:55:22 crc kubenswrapper[4553]: I0930 19:55:22.796510 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_484ae2ff-7e70-4177-85b7-66369d8a7d76/ovn-northd/0.log" Sep 30 19:55:22 crc kubenswrapper[4553]: I0930 19:55:22.849312 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_484ae2ff-7e70-4177-85b7-66369d8a7d76/openstack-network-exporter/0.log" Sep 30 19:55:23 crc kubenswrapper[4553]: I0930 19:55:23.086879 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2ace1318-99e8-4ab2-9244-ed0ca49e89d5/openstack-network-exporter/0.log" Sep 30 19:55:23 crc kubenswrapper[4553]: I0930 19:55:23.213597 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_2ace1318-99e8-4ab2-9244-ed0ca49e89d5/ovsdbserver-nb/0.log" Sep 30 19:55:23 crc kubenswrapper[4553]: I0930 19:55:23.432842 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ee750363-8434-413d-9bc9-fee0218e2e1b/openstack-network-exporter/0.log" Sep 30 19:55:23 crc kubenswrapper[4553]: I0930 19:55:23.569782 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ee750363-8434-413d-9bc9-fee0218e2e1b/ovsdbserver-sb/0.log" Sep 30 19:55:23 crc kubenswrapper[4553]: I0930 19:55:23.713212 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-89c84b54-dlsmt_a4059329-d42b-4d54-b952-feb9f5bd53b6/placement-api/0.log" Sep 30 19:55:23 crc kubenswrapper[4553]: I0930 19:55:23.877385 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-89c84b54-dlsmt_a4059329-d42b-4d54-b952-feb9f5bd53b6/placement-log/0.log" Sep 30 19:55:23 crc kubenswrapper[4553]: I0930 19:55:23.902703 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_879abe38-75bc-4f92-9b0a-52524daadaee/memcached/0.log" Sep 30 19:55:24 crc kubenswrapper[4553]: I0930 19:55:24.021053 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-a1f4-account-create-vqppj_5c5e64e4-7905-4524-bd33-8ab355eb2c90/mariadb-account-create/0.log" Sep 30 19:55:24 crc kubenswrapper[4553]: I0930 19:55:24.103991 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-create-8vprf_f1c23598-c1a7-4544-9204-f071ac589644/mariadb-database-create/0.log" Sep 30 19:55:24 crc kubenswrapper[4553]: I0930 19:55:24.246620 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-sync-zsslz_08c9fecb-7dc9-4aed-b134-98995f1cf280/placement-db-sync/0.log" Sep 30 19:55:24 crc kubenswrapper[4553]: I0930 19:55:24.401207 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ae9d46b0-115e-4584-8fd4-b89e0f635103/setup-container/0.log" Sep 30 19:55:24 crc kubenswrapper[4553]: I0930 19:55:24.620011 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ae9d46b0-115e-4584-8fd4-b89e0f635103/setup-container/0.log" Sep 30 19:55:24 crc kubenswrapper[4553]: I0930 19:55:24.674651 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ec990c27-2305-4611-83a7-1849ba2dffa9/setup-container/0.log" Sep 30 19:55:24 crc kubenswrapper[4553]: I0930 19:55:24.676011 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ae9d46b0-115e-4584-8fd4-b89e0f635103/rabbitmq/0.log" Sep 30 19:55:24 crc kubenswrapper[4553]: I0930 19:55:24.876494 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_ec990c27-2305-4611-83a7-1849ba2dffa9/setup-container/0.log" Sep 30 19:55:24 crc kubenswrapper[4553]: I0930 19:55:24.902556 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ec990c27-2305-4611-83a7-1849ba2dffa9/rabbitmq/0.log" Sep 30 19:55:24 crc kubenswrapper[4553]: I0930 19:55:24.990485 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-74bb65c547-lfcd8_d8354628-b925-4289-9c68-7038dd4b2064/proxy-httpd/0.log" Sep 30 19:55:25 crc kubenswrapper[4553]: I0930 19:55:25.100295 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-74bb65c547-lfcd8_d8354628-b925-4289-9c68-7038dd4b2064/proxy-server/0.log" Sep 30 19:55:25 crc kubenswrapper[4553]: I0930 19:55:25.402946 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-bfvdv_1af5c0f2-d0c8-4b67-889b-b677e346c46c/swift-ring-rebalance/0.log" Sep 30 19:55:25 crc kubenswrapper[4553]: I0930 19:55:25.543311 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/account-auditor/0.log" Sep 30 19:55:25 crc kubenswrapper[4553]: I0930 19:55:25.581600 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/account-replicator/0.log" Sep 30 19:55:25 crc kubenswrapper[4553]: I0930 19:55:25.638254 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/account-reaper/0.log" Sep 30 19:55:25 crc kubenswrapper[4553]: I0930 19:55:25.673481 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/account-server/0.log" Sep 30 19:55:25 crc kubenswrapper[4553]: I0930 19:55:25.780625 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/container-auditor/0.log" Sep 30 19:55:25 crc kubenswrapper[4553]: I0930 19:55:25.826924 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/container-replicator/0.log" Sep 30 19:55:25 crc kubenswrapper[4553]: I0930 19:55:25.901412 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/container-server/0.log" Sep 30 19:55:25 crc kubenswrapper[4553]: I0930 19:55:25.956938 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/container-updater/0.log" Sep 30 19:55:26 crc kubenswrapper[4553]: I0930 19:55:26.007834 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/object-auditor/0.log" Sep 30 19:55:26 crc kubenswrapper[4553]: I0930 19:55:26.046781 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/object-expirer/0.log" Sep 30 19:55:26 crc kubenswrapper[4553]: I0930 19:55:26.093720 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/object-replicator/0.log" Sep 30 19:55:26 crc kubenswrapper[4553]: I0930 19:55:26.164375 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/object-server/0.log" Sep 30 19:55:26 crc kubenswrapper[4553]: I0930 19:55:26.230352 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/object-updater/0.log" Sep 30 19:55:26 crc kubenswrapper[4553]: I0930 19:55:26.291368 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/rsync/0.log" Sep 30 19:55:26 crc kubenswrapper[4553]: I0930 19:55:26.331110 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_0af05a35-cd0b-4875-b263-c8c62ebaa2cc/swift-recon-cron/0.log" Sep 30 19:55:37 crc kubenswrapper[4553]: I0930 19:55:37.653399 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bdk4j"] Sep 30 19:55:37 crc kubenswrapper[4553]: I0930 19:55:37.655750 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:37 crc kubenswrapper[4553]: I0930 19:55:37.716672 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bdk4j"] Sep 30 19:55:37 crc kubenswrapper[4553]: I0930 19:55:37.780880 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-utilities\") pod \"certified-operators-bdk4j\" (UID: \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\") " pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:37 crc kubenswrapper[4553]: I0930 19:55:37.780952 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjcm5\" (UniqueName: \"kubernetes.io/projected/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-kube-api-access-cjcm5\") pod \"certified-operators-bdk4j\" (UID: \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\") " pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:37 crc kubenswrapper[4553]: I0930 19:55:37.781158 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-catalog-content\") pod \"certified-operators-bdk4j\" (UID: 
\"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\") " pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:37 crc kubenswrapper[4553]: I0930 19:55:37.883201 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-utilities\") pod \"certified-operators-bdk4j\" (UID: \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\") " pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:37 crc kubenswrapper[4553]: I0930 19:55:37.883275 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjcm5\" (UniqueName: \"kubernetes.io/projected/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-kube-api-access-cjcm5\") pod \"certified-operators-bdk4j\" (UID: \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\") " pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:37 crc kubenswrapper[4553]: I0930 19:55:37.883316 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-catalog-content\") pod \"certified-operators-bdk4j\" (UID: \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\") " pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:37 crc kubenswrapper[4553]: I0930 19:55:37.883765 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-catalog-content\") pod \"certified-operators-bdk4j\" (UID: \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\") " pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:37 crc kubenswrapper[4553]: I0930 19:55:37.883950 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-utilities\") pod \"certified-operators-bdk4j\" (UID: \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\") 
" pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:37 crc kubenswrapper[4553]: I0930 19:55:37.914869 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjcm5\" (UniqueName: \"kubernetes.io/projected/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-kube-api-access-cjcm5\") pod \"certified-operators-bdk4j\" (UID: \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\") " pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:37 crc kubenswrapper[4553]: I0930 19:55:37.974272 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:38 crc kubenswrapper[4553]: I0930 19:55:38.678973 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bdk4j"] Sep 30 19:55:38 crc kubenswrapper[4553]: I0930 19:55:38.770310 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdk4j" event={"ID":"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5","Type":"ContainerStarted","Data":"f686427782337af96b7a593dbe24a228372ea36eec70df06e78e2c6b9a11a2c8"} Sep 30 19:55:39 crc kubenswrapper[4553]: I0930 19:55:39.780261 4553 generic.go:334] "Generic (PLEG): container finished" podID="e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" containerID="3e19000604a8036c0a2679ea1829bf73a5c05d3d343e85765ef665e10bba0cf0" exitCode=0 Sep 30 19:55:39 crc kubenswrapper[4553]: I0930 19:55:39.780315 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdk4j" event={"ID":"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5","Type":"ContainerDied","Data":"3e19000604a8036c0a2679ea1829bf73a5c05d3d343e85765ef665e10bba0cf0"} Sep 30 19:55:40 crc kubenswrapper[4553]: I0930 19:55:40.794318 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdk4j" 
event={"ID":"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5","Type":"ContainerStarted","Data":"82c97bfab2922b765a71a6e6d47e693b2ef014d3f715967e6cd9736c6bcde824"} Sep 30 19:55:42 crc kubenswrapper[4553]: I0930 19:55:42.816682 4553 generic.go:334] "Generic (PLEG): container finished" podID="e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" containerID="82c97bfab2922b765a71a6e6d47e693b2ef014d3f715967e6cd9736c6bcde824" exitCode=0 Sep 30 19:55:42 crc kubenswrapper[4553]: I0930 19:55:42.818656 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdk4j" event={"ID":"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5","Type":"ContainerDied","Data":"82c97bfab2922b765a71a6e6d47e693b2ef014d3f715967e6cd9736c6bcde824"} Sep 30 19:55:43 crc kubenswrapper[4553]: I0930 19:55:43.833205 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdk4j" event={"ID":"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5","Type":"ContainerStarted","Data":"b810cc84501916564e07455db5a5d51c6a837421348b009fc6f37bb0fb7490c0"} Sep 30 19:55:43 crc kubenswrapper[4553]: I0930 19:55:43.860786 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bdk4j" podStartSLOduration=3.362941683 podStartE2EDuration="6.860764015s" podCreationTimestamp="2025-09-30 19:55:37 +0000 UTC" firstStartedPulling="2025-09-30 19:55:39.781792547 +0000 UTC m=+1392.981294677" lastFinishedPulling="2025-09-30 19:55:43.279614849 +0000 UTC m=+1396.479117009" observedRunningTime="2025-09-30 19:55:43.850287883 +0000 UTC m=+1397.049790043" watchObservedRunningTime="2025-09-30 19:55:43.860764015 +0000 UTC m=+1397.060266155" Sep 30 19:55:43 crc kubenswrapper[4553]: I0930 19:55:43.969891 4553 scope.go:117] "RemoveContainer" containerID="d49ed18512af1979da607775f33ac9e3d773a2c6383fe37bde1a20c1fdb2a781" Sep 30 19:55:47 crc kubenswrapper[4553]: I0930 19:55:47.975516 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:47 crc kubenswrapper[4553]: I0930 19:55:47.976141 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:49 crc kubenswrapper[4553]: I0930 19:55:49.058073 4553 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-bdk4j" podUID="e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" containerName="registry-server" probeResult="failure" output=< Sep 30 19:55:49 crc kubenswrapper[4553]: timeout: failed to connect service ":50051" within 1s Sep 30 19:55:49 crc kubenswrapper[4553]: > Sep 30 19:55:53 crc kubenswrapper[4553]: I0930 19:55:53.947330 4553 generic.go:334] "Generic (PLEG): container finished" podID="ef27c2ea-1df1-46cc-82a4-eb78cd73c09c" containerID="1a303d4e409c7102b6ffe0a4536001600b5ac7de16d9a236d1c6ac79e5f05394" exitCode=0 Sep 30 19:55:53 crc kubenswrapper[4553]: I0930 19:55:53.947446 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" event={"ID":"ef27c2ea-1df1-46cc-82a4-eb78cd73c09c","Type":"ContainerDied","Data":"1a303d4e409c7102b6ffe0a4536001600b5ac7de16d9a236d1c6ac79e5f05394"} Sep 30 19:55:55 crc kubenswrapper[4553]: I0930 19:55:55.072613 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" Sep 30 19:55:55 crc kubenswrapper[4553]: I0930 19:55:55.135167 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef27c2ea-1df1-46cc-82a4-eb78cd73c09c-host\") pod \"ef27c2ea-1df1-46cc-82a4-eb78cd73c09c\" (UID: \"ef27c2ea-1df1-46cc-82a4-eb78cd73c09c\") " Sep 30 19:55:55 crc kubenswrapper[4553]: I0930 19:55:55.135309 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zncf2\" (UniqueName: \"kubernetes.io/projected/ef27c2ea-1df1-46cc-82a4-eb78cd73c09c-kube-api-access-zncf2\") pod \"ef27c2ea-1df1-46cc-82a4-eb78cd73c09c\" (UID: \"ef27c2ea-1df1-46cc-82a4-eb78cd73c09c\") " Sep 30 19:55:55 crc kubenswrapper[4553]: I0930 19:55:55.137933 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef27c2ea-1df1-46cc-82a4-eb78cd73c09c-host" (OuterVolumeSpecName: "host") pod "ef27c2ea-1df1-46cc-82a4-eb78cd73c09c" (UID: "ef27c2ea-1df1-46cc-82a4-eb78cd73c09c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:55:55 crc kubenswrapper[4553]: I0930 19:55:55.138582 4553 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef27c2ea-1df1-46cc-82a4-eb78cd73c09c-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:55:55 crc kubenswrapper[4553]: I0930 19:55:55.150053 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4hfkh/crc-debug-6s5kj"] Sep 30 19:55:55 crc kubenswrapper[4553]: I0930 19:55:55.152196 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef27c2ea-1df1-46cc-82a4-eb78cd73c09c-kube-api-access-zncf2" (OuterVolumeSpecName: "kube-api-access-zncf2") pod "ef27c2ea-1df1-46cc-82a4-eb78cd73c09c" (UID: "ef27c2ea-1df1-46cc-82a4-eb78cd73c09c"). 
InnerVolumeSpecName "kube-api-access-zncf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:55:55 crc kubenswrapper[4553]: I0930 19:55:55.173731 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4hfkh/crc-debug-6s5kj"] Sep 30 19:55:55 crc kubenswrapper[4553]: I0930 19:55:55.240258 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zncf2\" (UniqueName: \"kubernetes.io/projected/ef27c2ea-1df1-46cc-82a4-eb78cd73c09c-kube-api-access-zncf2\") on node \"crc\" DevicePath \"\"" Sep 30 19:55:55 crc kubenswrapper[4553]: I0930 19:55:55.529657 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef27c2ea-1df1-46cc-82a4-eb78cd73c09c" path="/var/lib/kubelet/pods/ef27c2ea-1df1-46cc-82a4-eb78cd73c09c/volumes" Sep 30 19:55:55 crc kubenswrapper[4553]: I0930 19:55:55.979644 4553 scope.go:117] "RemoveContainer" containerID="1a303d4e409c7102b6ffe0a4536001600b5ac7de16d9a236d1c6ac79e5f05394" Sep 30 19:55:55 crc kubenswrapper[4553]: I0930 19:55:55.979730 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hfkh/crc-debug-6s5kj" Sep 30 19:55:56 crc kubenswrapper[4553]: I0930 19:55:56.370339 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4hfkh/crc-debug-k44hw"] Sep 30 19:55:56 crc kubenswrapper[4553]: E0930 19:55:56.370949 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef27c2ea-1df1-46cc-82a4-eb78cd73c09c" containerName="container-00" Sep 30 19:55:56 crc kubenswrapper[4553]: I0930 19:55:56.370971 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef27c2ea-1df1-46cc-82a4-eb78cd73c09c" containerName="container-00" Sep 30 19:55:56 crc kubenswrapper[4553]: I0930 19:55:56.371355 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef27c2ea-1df1-46cc-82a4-eb78cd73c09c" containerName="container-00" Sep 30 19:55:56 crc kubenswrapper[4553]: I0930 19:55:56.372319 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4hfkh/crc-debug-k44hw" Sep 30 19:55:56 crc kubenswrapper[4553]: I0930 19:55:56.377249 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4hfkh"/"default-dockercfg-rxxz9" Sep 30 19:55:56 crc kubenswrapper[4553]: I0930 19:55:56.466166 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5sl6\" (UniqueName: \"kubernetes.io/projected/2da8044c-98b3-4100-88b8-4a933e16d96a-kube-api-access-t5sl6\") pod \"crc-debug-k44hw\" (UID: \"2da8044c-98b3-4100-88b8-4a933e16d96a\") " pod="openshift-must-gather-4hfkh/crc-debug-k44hw" Sep 30 19:55:56 crc kubenswrapper[4553]: I0930 19:55:56.466418 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2da8044c-98b3-4100-88b8-4a933e16d96a-host\") pod \"crc-debug-k44hw\" (UID: \"2da8044c-98b3-4100-88b8-4a933e16d96a\") " 
pod="openshift-must-gather-4hfkh/crc-debug-k44hw" Sep 30 19:55:56 crc kubenswrapper[4553]: I0930 19:55:56.568145 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5sl6\" (UniqueName: \"kubernetes.io/projected/2da8044c-98b3-4100-88b8-4a933e16d96a-kube-api-access-t5sl6\") pod \"crc-debug-k44hw\" (UID: \"2da8044c-98b3-4100-88b8-4a933e16d96a\") " pod="openshift-must-gather-4hfkh/crc-debug-k44hw" Sep 30 19:55:56 crc kubenswrapper[4553]: I0930 19:55:56.569177 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2da8044c-98b3-4100-88b8-4a933e16d96a-host\") pod \"crc-debug-k44hw\" (UID: \"2da8044c-98b3-4100-88b8-4a933e16d96a\") " pod="openshift-must-gather-4hfkh/crc-debug-k44hw" Sep 30 19:55:56 crc kubenswrapper[4553]: I0930 19:55:56.569254 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2da8044c-98b3-4100-88b8-4a933e16d96a-host\") pod \"crc-debug-k44hw\" (UID: \"2da8044c-98b3-4100-88b8-4a933e16d96a\") " pod="openshift-must-gather-4hfkh/crc-debug-k44hw" Sep 30 19:55:56 crc kubenswrapper[4553]: I0930 19:55:56.605338 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5sl6\" (UniqueName: \"kubernetes.io/projected/2da8044c-98b3-4100-88b8-4a933e16d96a-kube-api-access-t5sl6\") pod \"crc-debug-k44hw\" (UID: \"2da8044c-98b3-4100-88b8-4a933e16d96a\") " pod="openshift-must-gather-4hfkh/crc-debug-k44hw" Sep 30 19:55:56 crc kubenswrapper[4553]: I0930 19:55:56.707836 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hfkh/crc-debug-k44hw" Sep 30 19:55:56 crc kubenswrapper[4553]: I0930 19:55:56.994301 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hfkh/crc-debug-k44hw" event={"ID":"2da8044c-98b3-4100-88b8-4a933e16d96a","Type":"ContainerStarted","Data":"d34ac609d1d45d67f338bf0fdba0730be18c6900a17315c4c0942e91e53005b8"} Sep 30 19:55:58 crc kubenswrapper[4553]: I0930 19:55:58.013681 4553 generic.go:334] "Generic (PLEG): container finished" podID="2da8044c-98b3-4100-88b8-4a933e16d96a" containerID="b7e16aa5c095104ead1da275a813fd2e6f9dc131275ad988ef0c696cefe07394" exitCode=0 Sep 30 19:55:58 crc kubenswrapper[4553]: I0930 19:55:58.013732 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hfkh/crc-debug-k44hw" event={"ID":"2da8044c-98b3-4100-88b8-4a933e16d96a","Type":"ContainerDied","Data":"b7e16aa5c095104ead1da275a813fd2e6f9dc131275ad988ef0c696cefe07394"} Sep 30 19:55:58 crc kubenswrapper[4553]: I0930 19:55:58.069666 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:58 crc kubenswrapper[4553]: I0930 19:55:58.139781 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:55:58 crc kubenswrapper[4553]: I0930 19:55:58.313904 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bdk4j"] Sep 30 19:55:59 crc kubenswrapper[4553]: I0930 19:55:59.105748 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hfkh/crc-debug-k44hw" Sep 30 19:55:59 crc kubenswrapper[4553]: I0930 19:55:59.218458 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5sl6\" (UniqueName: \"kubernetes.io/projected/2da8044c-98b3-4100-88b8-4a933e16d96a-kube-api-access-t5sl6\") pod \"2da8044c-98b3-4100-88b8-4a933e16d96a\" (UID: \"2da8044c-98b3-4100-88b8-4a933e16d96a\") " Sep 30 19:55:59 crc kubenswrapper[4553]: I0930 19:55:59.218729 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2da8044c-98b3-4100-88b8-4a933e16d96a-host\") pod \"2da8044c-98b3-4100-88b8-4a933e16d96a\" (UID: \"2da8044c-98b3-4100-88b8-4a933e16d96a\") " Sep 30 19:55:59 crc kubenswrapper[4553]: I0930 19:55:59.219171 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2da8044c-98b3-4100-88b8-4a933e16d96a-host" (OuterVolumeSpecName: "host") pod "2da8044c-98b3-4100-88b8-4a933e16d96a" (UID: "2da8044c-98b3-4100-88b8-4a933e16d96a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:55:59 crc kubenswrapper[4553]: I0930 19:55:59.243233 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da8044c-98b3-4100-88b8-4a933e16d96a-kube-api-access-t5sl6" (OuterVolumeSpecName: "kube-api-access-t5sl6") pod "2da8044c-98b3-4100-88b8-4a933e16d96a" (UID: "2da8044c-98b3-4100-88b8-4a933e16d96a"). InnerVolumeSpecName "kube-api-access-t5sl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:55:59 crc kubenswrapper[4553]: I0930 19:55:59.320599 4553 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2da8044c-98b3-4100-88b8-4a933e16d96a-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:55:59 crc kubenswrapper[4553]: I0930 19:55:59.320631 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5sl6\" (UniqueName: \"kubernetes.io/projected/2da8044c-98b3-4100-88b8-4a933e16d96a-kube-api-access-t5sl6\") on node \"crc\" DevicePath \"\"" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.029136 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bdk4j" podUID="e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" containerName="registry-server" containerID="cri-o://b810cc84501916564e07455db5a5d51c6a837421348b009fc6f37bb0fb7490c0" gracePeriod=2 Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.029545 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4hfkh/crc-debug-k44hw" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.029598 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hfkh/crc-debug-k44hw" event={"ID":"2da8044c-98b3-4100-88b8-4a933e16d96a","Type":"ContainerDied","Data":"d34ac609d1d45d67f338bf0fdba0730be18c6900a17315c4c0942e91e53005b8"} Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.029623 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d34ac609d1d45d67f338bf0fdba0730be18c6900a17315c4c0942e91e53005b8" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.468582 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.541736 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjcm5\" (UniqueName: \"kubernetes.io/projected/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-kube-api-access-cjcm5\") pod \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\" (UID: \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\") " Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.542752 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-utilities\") pod \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\" (UID: \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\") " Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.542833 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-catalog-content\") pod \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\" (UID: \"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5\") " Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.545066 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-utilities" (OuterVolumeSpecName: "utilities") pod "e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" (UID: "e78bdb31-8a2c-44c6-87a9-fdceb1e557a5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.545991 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.567549 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-kube-api-access-cjcm5" (OuterVolumeSpecName: "kube-api-access-cjcm5") pod "e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" (UID: "e78bdb31-8a2c-44c6-87a9-fdceb1e557a5"). InnerVolumeSpecName "kube-api-access-cjcm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.601605 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" (UID: "e78bdb31-8a2c-44c6-87a9-fdceb1e557a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.647057 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.647080 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjcm5\" (UniqueName: \"kubernetes.io/projected/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5-kube-api-access-cjcm5\") on node \"crc\" DevicePath \"\"" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.935195 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ts89t"] Sep 30 19:56:00 crc kubenswrapper[4553]: E0930 19:56:00.942121 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da8044c-98b3-4100-88b8-4a933e16d96a" containerName="container-00" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.942145 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da8044c-98b3-4100-88b8-4a933e16d96a" containerName="container-00" Sep 30 19:56:00 crc kubenswrapper[4553]: E0930 19:56:00.942169 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" containerName="extract-content" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.942175 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" containerName="extract-content" Sep 30 19:56:00 crc kubenswrapper[4553]: E0930 19:56:00.942226 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" containerName="registry-server" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.942348 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" containerName="registry-server" Sep 30 19:56:00 crc 
kubenswrapper[4553]: E0930 19:56:00.942362 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" containerName="extract-utilities" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.942371 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" containerName="extract-utilities" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.949367 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da8044c-98b3-4100-88b8-4a933e16d96a" containerName="container-00" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.949404 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" containerName="registry-server" Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.962506 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ts89t"] Sep 30 19:56:00 crc kubenswrapper[4553]: I0930 19:56:00.962621 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.044014 4553 generic.go:334] "Generic (PLEG): container finished" podID="e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" containerID="b810cc84501916564e07455db5a5d51c6a837421348b009fc6f37bb0fb7490c0" exitCode=0 Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.044082 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdk4j" event={"ID":"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5","Type":"ContainerDied","Data":"b810cc84501916564e07455db5a5d51c6a837421348b009fc6f37bb0fb7490c0"} Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.044116 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bdk4j" event={"ID":"e78bdb31-8a2c-44c6-87a9-fdceb1e557a5","Type":"ContainerDied","Data":"f686427782337af96b7a593dbe24a228372ea36eec70df06e78e2c6b9a11a2c8"} Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.044122 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bdk4j" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.044136 4553 scope.go:117] "RemoveContainer" containerID="b810cc84501916564e07455db5a5d51c6a837421348b009fc6f37bb0fb7490c0" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.069156 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn69z\" (UniqueName: \"kubernetes.io/projected/a3ee2711-44a8-4de2-97bd-cf997c2974e9-kube-api-access-jn69z\") pod \"redhat-marketplace-ts89t\" (UID: \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\") " pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.069199 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3ee2711-44a8-4de2-97bd-cf997c2974e9-catalog-content\") pod \"redhat-marketplace-ts89t\" (UID: \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\") " pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.069233 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3ee2711-44a8-4de2-97bd-cf997c2974e9-utilities\") pod \"redhat-marketplace-ts89t\" (UID: \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\") " pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.071793 4553 scope.go:117] "RemoveContainer" containerID="82c97bfab2922b765a71a6e6d47e693b2ef014d3f715967e6cd9736c6bcde824" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.110539 4553 scope.go:117] "RemoveContainer" containerID="3e19000604a8036c0a2679ea1829bf73a5c05d3d343e85765ef665e10bba0cf0" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.115213 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-bdk4j"] Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.139673 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bdk4j"] Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.152186 4553 scope.go:117] "RemoveContainer" containerID="b810cc84501916564e07455db5a5d51c6a837421348b009fc6f37bb0fb7490c0" Sep 30 19:56:01 crc kubenswrapper[4553]: E0930 19:56:01.155228 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b810cc84501916564e07455db5a5d51c6a837421348b009fc6f37bb0fb7490c0\": container with ID starting with b810cc84501916564e07455db5a5d51c6a837421348b009fc6f37bb0fb7490c0 not found: ID does not exist" containerID="b810cc84501916564e07455db5a5d51c6a837421348b009fc6f37bb0fb7490c0" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.155352 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b810cc84501916564e07455db5a5d51c6a837421348b009fc6f37bb0fb7490c0"} err="failed to get container status \"b810cc84501916564e07455db5a5d51c6a837421348b009fc6f37bb0fb7490c0\": rpc error: code = NotFound desc = could not find container \"b810cc84501916564e07455db5a5d51c6a837421348b009fc6f37bb0fb7490c0\": container with ID starting with b810cc84501916564e07455db5a5d51c6a837421348b009fc6f37bb0fb7490c0 not found: ID does not exist" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.155457 4553 scope.go:117] "RemoveContainer" containerID="82c97bfab2922b765a71a6e6d47e693b2ef014d3f715967e6cd9736c6bcde824" Sep 30 19:56:01 crc kubenswrapper[4553]: E0930 19:56:01.155968 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82c97bfab2922b765a71a6e6d47e693b2ef014d3f715967e6cd9736c6bcde824\": container with ID starting with 
82c97bfab2922b765a71a6e6d47e693b2ef014d3f715967e6cd9736c6bcde824 not found: ID does not exist" containerID="82c97bfab2922b765a71a6e6d47e693b2ef014d3f715967e6cd9736c6bcde824" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.156090 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82c97bfab2922b765a71a6e6d47e693b2ef014d3f715967e6cd9736c6bcde824"} err="failed to get container status \"82c97bfab2922b765a71a6e6d47e693b2ef014d3f715967e6cd9736c6bcde824\": rpc error: code = NotFound desc = could not find container \"82c97bfab2922b765a71a6e6d47e693b2ef014d3f715967e6cd9736c6bcde824\": container with ID starting with 82c97bfab2922b765a71a6e6d47e693b2ef014d3f715967e6cd9736c6bcde824 not found: ID does not exist" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.156172 4553 scope.go:117] "RemoveContainer" containerID="3e19000604a8036c0a2679ea1829bf73a5c05d3d343e85765ef665e10bba0cf0" Sep 30 19:56:01 crc kubenswrapper[4553]: E0930 19:56:01.159675 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e19000604a8036c0a2679ea1829bf73a5c05d3d343e85765ef665e10bba0cf0\": container with ID starting with 3e19000604a8036c0a2679ea1829bf73a5c05d3d343e85765ef665e10bba0cf0 not found: ID does not exist" containerID="3e19000604a8036c0a2679ea1829bf73a5c05d3d343e85765ef665e10bba0cf0" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.159715 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e19000604a8036c0a2679ea1829bf73a5c05d3d343e85765ef665e10bba0cf0"} err="failed to get container status \"3e19000604a8036c0a2679ea1829bf73a5c05d3d343e85765ef665e10bba0cf0\": rpc error: code = NotFound desc = could not find container \"3e19000604a8036c0a2679ea1829bf73a5c05d3d343e85765ef665e10bba0cf0\": container with ID starting with 3e19000604a8036c0a2679ea1829bf73a5c05d3d343e85765ef665e10bba0cf0 not found: ID does not 
exist" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.170517 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn69z\" (UniqueName: \"kubernetes.io/projected/a3ee2711-44a8-4de2-97bd-cf997c2974e9-kube-api-access-jn69z\") pod \"redhat-marketplace-ts89t\" (UID: \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\") " pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.170591 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3ee2711-44a8-4de2-97bd-cf997c2974e9-catalog-content\") pod \"redhat-marketplace-ts89t\" (UID: \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\") " pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.170628 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3ee2711-44a8-4de2-97bd-cf997c2974e9-utilities\") pod \"redhat-marketplace-ts89t\" (UID: \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\") " pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.171075 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3ee2711-44a8-4de2-97bd-cf997c2974e9-utilities\") pod \"redhat-marketplace-ts89t\" (UID: \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\") " pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.171511 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3ee2711-44a8-4de2-97bd-cf997c2974e9-catalog-content\") pod \"redhat-marketplace-ts89t\" (UID: \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\") " pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:01 crc kubenswrapper[4553]: 
I0930 19:56:01.195132 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn69z\" (UniqueName: \"kubernetes.io/projected/a3ee2711-44a8-4de2-97bd-cf997c2974e9-kube-api-access-jn69z\") pod \"redhat-marketplace-ts89t\" (UID: \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\") " pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.282453 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.533725 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e78bdb31-8a2c-44c6-87a9-fdceb1e557a5" path="/var/lib/kubelet/pods/e78bdb31-8a2c-44c6-87a9-fdceb1e557a5/volumes" Sep 30 19:56:01 crc kubenswrapper[4553]: I0930 19:56:01.813877 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ts89t"] Sep 30 19:56:02 crc kubenswrapper[4553]: I0930 19:56:02.059170 4553 generic.go:334] "Generic (PLEG): container finished" podID="a3ee2711-44a8-4de2-97bd-cf997c2974e9" containerID="96cf4fae185f2064029e0b5624b830d42eea583049a22e5da950c0e72b4a56b0" exitCode=0 Sep 30 19:56:02 crc kubenswrapper[4553]: I0930 19:56:02.059209 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts89t" event={"ID":"a3ee2711-44a8-4de2-97bd-cf997c2974e9","Type":"ContainerDied","Data":"96cf4fae185f2064029e0b5624b830d42eea583049a22e5da950c0e72b4a56b0"} Sep 30 19:56:02 crc kubenswrapper[4553]: I0930 19:56:02.059231 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts89t" event={"ID":"a3ee2711-44a8-4de2-97bd-cf997c2974e9","Type":"ContainerStarted","Data":"e4beb71f95e32a36fc9688e7e73ae1b73afd524a5aff76948f34f9da6946d471"} Sep 30 19:56:02 crc kubenswrapper[4553]: I0930 19:56:02.502210 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-4hfkh/crc-debug-k44hw"] Sep 30 19:56:02 crc kubenswrapper[4553]: I0930 19:56:02.511241 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4hfkh/crc-debug-k44hw"] Sep 30 19:56:03 crc kubenswrapper[4553]: I0930 19:56:03.085065 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts89t" event={"ID":"a3ee2711-44a8-4de2-97bd-cf997c2974e9","Type":"ContainerStarted","Data":"4717faa4db12a01ce3520138983bc2c3097d6de4d06054c633a148d37b3e2431"} Sep 30 19:56:03 crc kubenswrapper[4553]: I0930 19:56:03.524950 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da8044c-98b3-4100-88b8-4a933e16d96a" path="/var/lib/kubelet/pods/2da8044c-98b3-4100-88b8-4a933e16d96a/volumes" Sep 30 19:56:03 crc kubenswrapper[4553]: I0930 19:56:03.730481 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4hfkh/crc-debug-9crdr"] Sep 30 19:56:03 crc kubenswrapper[4553]: I0930 19:56:03.732013 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4hfkh/crc-debug-9crdr" Sep 30 19:56:03 crc kubenswrapper[4553]: I0930 19:56:03.736085 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4hfkh"/"default-dockercfg-rxxz9" Sep 30 19:56:03 crc kubenswrapper[4553]: I0930 19:56:03.829084 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5-host\") pod \"crc-debug-9crdr\" (UID: \"8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5\") " pod="openshift-must-gather-4hfkh/crc-debug-9crdr" Sep 30 19:56:03 crc kubenswrapper[4553]: I0930 19:56:03.829171 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg845\" (UniqueName: \"kubernetes.io/projected/8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5-kube-api-access-tg845\") pod \"crc-debug-9crdr\" (UID: \"8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5\") " pod="openshift-must-gather-4hfkh/crc-debug-9crdr" Sep 30 19:56:03 crc kubenswrapper[4553]: I0930 19:56:03.930920 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg845\" (UniqueName: \"kubernetes.io/projected/8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5-kube-api-access-tg845\") pod \"crc-debug-9crdr\" (UID: \"8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5\") " pod="openshift-must-gather-4hfkh/crc-debug-9crdr" Sep 30 19:56:03 crc kubenswrapper[4553]: I0930 19:56:03.931175 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5-host\") pod \"crc-debug-9crdr\" (UID: \"8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5\") " pod="openshift-must-gather-4hfkh/crc-debug-9crdr" Sep 30 19:56:03 crc kubenswrapper[4553]: I0930 19:56:03.931309 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5-host\") pod \"crc-debug-9crdr\" (UID: \"8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5\") " pod="openshift-must-gather-4hfkh/crc-debug-9crdr" Sep 30 19:56:03 crc kubenswrapper[4553]: I0930 19:56:03.959894 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg845\" (UniqueName: \"kubernetes.io/projected/8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5-kube-api-access-tg845\") pod \"crc-debug-9crdr\" (UID: \"8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5\") " pod="openshift-must-gather-4hfkh/crc-debug-9crdr" Sep 30 19:56:04 crc kubenswrapper[4553]: I0930 19:56:04.047109 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4hfkh/crc-debug-9crdr" Sep 30 19:56:04 crc kubenswrapper[4553]: W0930 19:56:04.075500 4553 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e6595a8_ef8b_4e16_a9e3_ece9b0e692f5.slice/crio-1c89cd8692c7677541454a6047b5c914b910c1943a9b0355c1fb5bdceb914dc6 WatchSource:0}: Error finding container 1c89cd8692c7677541454a6047b5c914b910c1943a9b0355c1fb5bdceb914dc6: Status 404 returned error can't find the container with id 1c89cd8692c7677541454a6047b5c914b910c1943a9b0355c1fb5bdceb914dc6 Sep 30 19:56:04 crc kubenswrapper[4553]: I0930 19:56:04.093376 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hfkh/crc-debug-9crdr" event={"ID":"8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5","Type":"ContainerStarted","Data":"1c89cd8692c7677541454a6047b5c914b910c1943a9b0355c1fb5bdceb914dc6"} Sep 30 19:56:04 crc kubenswrapper[4553]: I0930 19:56:04.094647 4553 generic.go:334] "Generic (PLEG): container finished" podID="a3ee2711-44a8-4de2-97bd-cf997c2974e9" containerID="4717faa4db12a01ce3520138983bc2c3097d6de4d06054c633a148d37b3e2431" exitCode=0 Sep 30 19:56:04 crc kubenswrapper[4553]: I0930 19:56:04.094676 4553 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-ts89t" event={"ID":"a3ee2711-44a8-4de2-97bd-cf997c2974e9","Type":"ContainerDied","Data":"4717faa4db12a01ce3520138983bc2c3097d6de4d06054c633a148d37b3e2431"} Sep 30 19:56:05 crc kubenswrapper[4553]: I0930 19:56:05.110166 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts89t" event={"ID":"a3ee2711-44a8-4de2-97bd-cf997c2974e9","Type":"ContainerStarted","Data":"af348938d03b7e4c859b97e37da9b81822e352b4aabbe041ecfedb3636be45e8"} Sep 30 19:56:05 crc kubenswrapper[4553]: I0930 19:56:05.112853 4553 generic.go:334] "Generic (PLEG): container finished" podID="8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5" containerID="e5e486b93606f3670198af23eba25c072418babd254fd7bfddfaa5da9d98994c" exitCode=0 Sep 30 19:56:05 crc kubenswrapper[4553]: I0930 19:56:05.112920 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hfkh/crc-debug-9crdr" event={"ID":"8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5","Type":"ContainerDied","Data":"e5e486b93606f3670198af23eba25c072418babd254fd7bfddfaa5da9d98994c"} Sep 30 19:56:05 crc kubenswrapper[4553]: I0930 19:56:05.153166 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ts89t" podStartSLOduration=2.4285764 podStartE2EDuration="5.153146602s" podCreationTimestamp="2025-09-30 19:56:00 +0000 UTC" firstStartedPulling="2025-09-30 19:56:02.064408778 +0000 UTC m=+1415.263910908" lastFinishedPulling="2025-09-30 19:56:04.78897894 +0000 UTC m=+1417.988481110" observedRunningTime="2025-09-30 19:56:05.143518044 +0000 UTC m=+1418.343020204" watchObservedRunningTime="2025-09-30 19:56:05.153146602 +0000 UTC m=+1418.352648742" Sep 30 19:56:05 crc kubenswrapper[4553]: I0930 19:56:05.207983 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4hfkh/crc-debug-9crdr"] Sep 30 19:56:05 crc kubenswrapper[4553]: I0930 19:56:05.218650 4553 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4hfkh/crc-debug-9crdr"] Sep 30 19:56:06 crc kubenswrapper[4553]: I0930 19:56:06.210062 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4hfkh/crc-debug-9crdr" Sep 30 19:56:06 crc kubenswrapper[4553]: I0930 19:56:06.270369 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5-host\") pod \"8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5\" (UID: \"8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5\") " Sep 30 19:56:06 crc kubenswrapper[4553]: I0930 19:56:06.270460 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5-host" (OuterVolumeSpecName: "host") pod "8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5" (UID: "8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 19:56:06 crc kubenswrapper[4553]: I0930 19:56:06.270481 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg845\" (UniqueName: \"kubernetes.io/projected/8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5-kube-api-access-tg845\") pod \"8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5\" (UID: \"8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5\") " Sep 30 19:56:06 crc kubenswrapper[4553]: I0930 19:56:06.271411 4553 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5-host\") on node \"crc\" DevicePath \"\"" Sep 30 19:56:06 crc kubenswrapper[4553]: I0930 19:56:06.280821 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5-kube-api-access-tg845" (OuterVolumeSpecName: "kube-api-access-tg845") pod "8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5" (UID: 
"8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5"). InnerVolumeSpecName "kube-api-access-tg845". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:56:06 crc kubenswrapper[4553]: I0930 19:56:06.372771 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg845\" (UniqueName: \"kubernetes.io/projected/8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5-kube-api-access-tg845\") on node \"crc\" DevicePath \"\"" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.006010 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj_5aa35519-bdc4-4eb7-a039-7238829d51ac/util/0.log" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.128970 4553 scope.go:117] "RemoveContainer" containerID="e5e486b93606f3670198af23eba25c072418babd254fd7bfddfaa5da9d98994c" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.129065 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4hfkh/crc-debug-9crdr" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.198140 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj_5aa35519-bdc4-4eb7-a039-7238829d51ac/util/0.log" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.216604 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj_5aa35519-bdc4-4eb7-a039-7238829d51ac/pull/0.log" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.216828 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj_5aa35519-bdc4-4eb7-a039-7238829d51ac/pull/0.log" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.417114 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj_5aa35519-bdc4-4eb7-a039-7238829d51ac/util/0.log" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.418552 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj_5aa35519-bdc4-4eb7-a039-7238829d51ac/pull/0.log" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.418885 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b06609af47042a6f8e2d87c80942a2e7fa4642ad364cacb64cd797b3dmhnj_5aa35519-bdc4-4eb7-a039-7238829d51ac/extract/0.log" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.514657 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5" path="/var/lib/kubelet/pods/8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5/volumes" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.627101 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-r2zwk_aebfd6cd-5a72-4797-b16a-492efaa1016e/kube-rbac-proxy/0.log" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.657886 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-r2zwk_aebfd6cd-5a72-4797-b16a-492efaa1016e/manager/0.log" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.704483 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-r7vxh_a38189f9-08d1-4f4b-8949-4856b0f46d95/kube-rbac-proxy/0.log" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.850193 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-r7vxh_a38189f9-08d1-4f4b-8949-4856b0f46d95/manager/0.log" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 
19:56:07.910161 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-5k8k5_485f984b-4520-4753-b6e7-4584137d3d58/kube-rbac-proxy/0.log" Sep 30 19:56:07 crc kubenswrapper[4553]: I0930 19:56:07.945413 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-5k8k5_485f984b-4520-4753-b6e7-4584137d3d58/manager/0.log" Sep 30 19:56:08 crc kubenswrapper[4553]: I0930 19:56:08.115019 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-xk7kj_a0036f14-fb94-4336-9e0b-d501cd080bd5/kube-rbac-proxy/0.log" Sep 30 19:56:08 crc kubenswrapper[4553]: I0930 19:56:08.204592 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-xk7kj_a0036f14-fb94-4336-9e0b-d501cd080bd5/manager/0.log" Sep 30 19:56:08 crc kubenswrapper[4553]: I0930 19:56:08.322320 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-ghxx6_2339ca48-ee02-4443-a1fd-4ae2456f6569/kube-rbac-proxy/0.log" Sep 30 19:56:08 crc kubenswrapper[4553]: I0930 19:56:08.354796 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-ghxx6_2339ca48-ee02-4443-a1fd-4ae2456f6569/manager/0.log" Sep 30 19:56:08 crc kubenswrapper[4553]: I0930 19:56:08.498487 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-9bbsc_fd82e0b0-7700-49a2-9a07-2695b2ffe2fc/kube-rbac-proxy/0.log" Sep 30 19:56:08 crc kubenswrapper[4553]: I0930 19:56:08.609922 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-9bbsc_fd82e0b0-7700-49a2-9a07-2695b2ffe2fc/manager/0.log" Sep 
30 19:56:08 crc kubenswrapper[4553]: I0930 19:56:08.843015 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-tk4kk_1b3e5dca-afd2-42de-a39a-e4e6fda92e90/kube-rbac-proxy/0.log" Sep 30 19:56:08 crc kubenswrapper[4553]: I0930 19:56:08.941565 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-tk4kk_1b3e5dca-afd2-42de-a39a-e4e6fda92e90/manager/0.log" Sep 30 19:56:08 crc kubenswrapper[4553]: I0930 19:56:08.995450 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-s8nwm_57489c87-d763-4cf2-a2c6-fd03b1ec7131/kube-rbac-proxy/0.log" Sep 30 19:56:09 crc kubenswrapper[4553]: I0930 19:56:09.405415 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-s8nwm_57489c87-d763-4cf2-a2c6-fd03b1ec7131/manager/0.log" Sep 30 19:56:09 crc kubenswrapper[4553]: I0930 19:56:09.450744 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-sjwqp_e973f7e5-4256-4b75-8f51-e01ca131eeca/kube-rbac-proxy/0.log" Sep 30 19:56:09 crc kubenswrapper[4553]: I0930 19:56:09.602178 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-sjwqp_e973f7e5-4256-4b75-8f51-e01ca131eeca/manager/0.log" Sep 30 19:56:09 crc kubenswrapper[4553]: I0930 19:56:09.661643 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-4sf9s_48e38a70-37cf-4efe-bac1-e0fe7b196b22/kube-rbac-proxy/0.log" Sep 30 19:56:09 crc kubenswrapper[4553]: I0930 19:56:09.712646 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-4sf9s_48e38a70-37cf-4efe-bac1-e0fe7b196b22/manager/0.log" Sep 30 19:56:09 crc kubenswrapper[4553]: I0930 19:56:09.943414 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-qhv4n_081051c3-9106-4b8f-8850-42facfbb5583/kube-rbac-proxy/0.log" Sep 30 19:56:09 crc kubenswrapper[4553]: I0930 19:56:09.954362 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-qhv4n_081051c3-9106-4b8f-8850-42facfbb5583/manager/0.log" Sep 30 19:56:10 crc kubenswrapper[4553]: I0930 19:56:10.136146 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-jl46p_fad8e76f-5b93-44c1-98d2-3f6f756cc23c/kube-rbac-proxy/0.log" Sep 30 19:56:10 crc kubenswrapper[4553]: I0930 19:56:10.228409 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-jl46p_fad8e76f-5b93-44c1-98d2-3f6f756cc23c/manager/0.log" Sep 30 19:56:10 crc kubenswrapper[4553]: I0930 19:56:10.336728 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-5lc25_e6fe293b-17b2-40c1-ac31-e456a23355b9/kube-rbac-proxy/0.log" Sep 30 19:56:10 crc kubenswrapper[4553]: I0930 19:56:10.478067 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-5lc25_e6fe293b-17b2-40c1-ac31-e456a23355b9/manager/0.log" Sep 30 19:56:10 crc kubenswrapper[4553]: I0930 19:56:10.484984 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-lgc4h_11181f5a-47aa-4d9b-b3eb-b6c5868bed4b/kube-rbac-proxy/0.log" Sep 30 19:56:10 crc kubenswrapper[4553]: I0930 19:56:10.598980 4553 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-lgc4h_11181f5a-47aa-4d9b-b3eb-b6c5868bed4b/manager/0.log" Sep 30 19:56:10 crc kubenswrapper[4553]: I0930 19:56:10.703402 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8crsqm8_2f1179b9-fc96-402c-9387-7fb33c26a489/kube-rbac-proxy/0.log" Sep 30 19:56:10 crc kubenswrapper[4553]: I0930 19:56:10.742683 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8crsqm8_2f1179b9-fc96-402c-9387-7fb33c26a489/manager/0.log" Sep 30 19:56:10 crc kubenswrapper[4553]: I0930 19:56:10.987403 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-98d66ccb9-2pvxf_2840394a-f6dd-4890-aec6-aab3f4f9eaba/kube-rbac-proxy/0.log" Sep 30 19:56:11 crc kubenswrapper[4553]: I0930 19:56:11.098613 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-67dd46bc9f-2kzlg_7c599bbe-1b64-4563-b235-5f2c58d234b5/kube-rbac-proxy/0.log" Sep 30 19:56:11 crc kubenswrapper[4553]: I0930 19:56:11.283272 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:11 crc kubenswrapper[4553]: I0930 19:56:11.283315 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:11 crc kubenswrapper[4553]: I0930 19:56:11.311641 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-67dd46bc9f-2kzlg_7c599bbe-1b64-4563-b235-5f2c58d234b5/operator/0.log" Sep 30 19:56:11 crc kubenswrapper[4553]: I0930 19:56:11.355875 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-6w8vg_b9037a2f-77dd-423b-9d81-432c9a554e15/registry-server/0.log" Sep 30 19:56:11 crc kubenswrapper[4553]: I0930 19:56:11.359359 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:11 crc kubenswrapper[4553]: I0930 19:56:11.558083 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-9b2qm_6d78a774-042e-4b7b-9988-971454080ca0/kube-rbac-proxy/0.log" Sep 30 19:56:11 crc kubenswrapper[4553]: I0930 19:56:11.591534 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-98d66ccb9-2pvxf_2840394a-f6dd-4890-aec6-aab3f4f9eaba/manager/0.log" Sep 30 19:56:11 crc kubenswrapper[4553]: I0930 19:56:11.664584 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-9b2qm_6d78a774-042e-4b7b-9988-971454080ca0/manager/0.log" Sep 30 19:56:11 crc kubenswrapper[4553]: I0930 19:56:11.728702 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-grf94_811049e5-2659-408a-9370-77fe827766e1/kube-rbac-proxy/0.log" Sep 30 19:56:11 crc kubenswrapper[4553]: I0930 19:56:11.828632 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-grf94_811049e5-2659-408a-9370-77fe827766e1/manager/0.log" Sep 30 19:56:11 crc kubenswrapper[4553]: I0930 19:56:11.885157 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-nbccn_899fda30-4ef6-499d-961b-6e23466c55e3/operator/0.log" Sep 30 19:56:11 crc kubenswrapper[4553]: I0930 19:56:11.929177 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-8cnk4_e5168b35-e85c-47e3-a641-e7003a2dbae7/kube-rbac-proxy/0.log" Sep 30 19:56:12 crc kubenswrapper[4553]: I0930 19:56:12.048302 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-8cnk4_e5168b35-e85c-47e3-a641-e7003a2dbae7/manager/0.log" Sep 30 19:56:12 crc kubenswrapper[4553]: I0930 19:56:12.156597 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-hqq9l_24a84108-9502-4a67-8452-7ebdf2e358ed/kube-rbac-proxy/0.log" Sep 30 19:56:12 crc kubenswrapper[4553]: I0930 19:56:12.194926 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-hqq9l_24a84108-9502-4a67-8452-7ebdf2e358ed/manager/0.log" Sep 30 19:56:12 crc kubenswrapper[4553]: I0930 19:56:12.214280 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:12 crc kubenswrapper[4553]: I0930 19:56:12.257060 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ts89t"] Sep 30 19:56:12 crc kubenswrapper[4553]: I0930 19:56:12.320233 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-m5mwg_0f072097-ae5a-4f90-86bb-8308893409d4/kube-rbac-proxy/0.log" Sep 30 19:56:12 crc kubenswrapper[4553]: I0930 19:56:12.393654 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-m5mwg_0f072097-ae5a-4f90-86bb-8308893409d4/manager/0.log" Sep 30 19:56:12 crc kubenswrapper[4553]: I0930 19:56:12.419325 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-kgtt2_826f8aa9-d307-4845-8e61-dc907c69a18c/kube-rbac-proxy/0.log" Sep 30 19:56:12 crc kubenswrapper[4553]: I0930 19:56:12.464822 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-kgtt2_826f8aa9-d307-4845-8e61-dc907c69a18c/manager/0.log" Sep 30 19:56:14 crc kubenswrapper[4553]: I0930 19:56:14.184115 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ts89t" podUID="a3ee2711-44a8-4de2-97bd-cf997c2974e9" containerName="registry-server" containerID="cri-o://af348938d03b7e4c859b97e37da9b81822e352b4aabbe041ecfedb3636be45e8" gracePeriod=2 Sep 30 19:56:14 crc kubenswrapper[4553]: I0930 19:56:14.635394 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:14 crc kubenswrapper[4553]: I0930 19:56:14.710344 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3ee2711-44a8-4de2-97bd-cf997c2974e9-catalog-content\") pod \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\" (UID: \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\") " Sep 30 19:56:14 crc kubenswrapper[4553]: I0930 19:56:14.710428 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3ee2711-44a8-4de2-97bd-cf997c2974e9-utilities\") pod \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\" (UID: \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\") " Sep 30 19:56:14 crc kubenswrapper[4553]: I0930 19:56:14.710513 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn69z\" (UniqueName: \"kubernetes.io/projected/a3ee2711-44a8-4de2-97bd-cf997c2974e9-kube-api-access-jn69z\") pod \"a3ee2711-44a8-4de2-97bd-cf997c2974e9\" (UID: 
\"a3ee2711-44a8-4de2-97bd-cf997c2974e9\") " Sep 30 19:56:14 crc kubenswrapper[4553]: I0930 19:56:14.711496 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3ee2711-44a8-4de2-97bd-cf997c2974e9-utilities" (OuterVolumeSpecName: "utilities") pod "a3ee2711-44a8-4de2-97bd-cf997c2974e9" (UID: "a3ee2711-44a8-4de2-97bd-cf997c2974e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:56:14 crc kubenswrapper[4553]: I0930 19:56:14.719183 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3ee2711-44a8-4de2-97bd-cf997c2974e9-kube-api-access-jn69z" (OuterVolumeSpecName: "kube-api-access-jn69z") pod "a3ee2711-44a8-4de2-97bd-cf997c2974e9" (UID: "a3ee2711-44a8-4de2-97bd-cf997c2974e9"). InnerVolumeSpecName "kube-api-access-jn69z". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:56:14 crc kubenswrapper[4553]: I0930 19:56:14.724882 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3ee2711-44a8-4de2-97bd-cf997c2974e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3ee2711-44a8-4de2-97bd-cf997c2974e9" (UID: "a3ee2711-44a8-4de2-97bd-cf997c2974e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:56:14 crc kubenswrapper[4553]: I0930 19:56:14.812904 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3ee2711-44a8-4de2-97bd-cf997c2974e9-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:56:14 crc kubenswrapper[4553]: I0930 19:56:14.812948 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn69z\" (UniqueName: \"kubernetes.io/projected/a3ee2711-44a8-4de2-97bd-cf997c2974e9-kube-api-access-jn69z\") on node \"crc\" DevicePath \"\"" Sep 30 19:56:14 crc kubenswrapper[4553]: I0930 19:56:14.812965 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3ee2711-44a8-4de2-97bd-cf997c2974e9-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.194924 4553 generic.go:334] "Generic (PLEG): container finished" podID="a3ee2711-44a8-4de2-97bd-cf997c2974e9" containerID="af348938d03b7e4c859b97e37da9b81822e352b4aabbe041ecfedb3636be45e8" exitCode=0 Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.194985 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ts89t" Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.195005 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts89t" event={"ID":"a3ee2711-44a8-4de2-97bd-cf997c2974e9","Type":"ContainerDied","Data":"af348938d03b7e4c859b97e37da9b81822e352b4aabbe041ecfedb3636be45e8"} Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.195363 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ts89t" event={"ID":"a3ee2711-44a8-4de2-97bd-cf997c2974e9","Type":"ContainerDied","Data":"e4beb71f95e32a36fc9688e7e73ae1b73afd524a5aff76948f34f9da6946d471"} Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.195381 4553 scope.go:117] "RemoveContainer" containerID="af348938d03b7e4c859b97e37da9b81822e352b4aabbe041ecfedb3636be45e8" Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.219452 4553 scope.go:117] "RemoveContainer" containerID="4717faa4db12a01ce3520138983bc2c3097d6de4d06054c633a148d37b3e2431" Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.232334 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ts89t"] Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.240881 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ts89t"] Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.265239 4553 scope.go:117] "RemoveContainer" containerID="96cf4fae185f2064029e0b5624b830d42eea583049a22e5da950c0e72b4a56b0" Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.294896 4553 scope.go:117] "RemoveContainer" containerID="af348938d03b7e4c859b97e37da9b81822e352b4aabbe041ecfedb3636be45e8" Sep 30 19:56:15 crc kubenswrapper[4553]: E0930 19:56:15.299997 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"af348938d03b7e4c859b97e37da9b81822e352b4aabbe041ecfedb3636be45e8\": container with ID starting with af348938d03b7e4c859b97e37da9b81822e352b4aabbe041ecfedb3636be45e8 not found: ID does not exist" containerID="af348938d03b7e4c859b97e37da9b81822e352b4aabbe041ecfedb3636be45e8" Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.300047 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af348938d03b7e4c859b97e37da9b81822e352b4aabbe041ecfedb3636be45e8"} err="failed to get container status \"af348938d03b7e4c859b97e37da9b81822e352b4aabbe041ecfedb3636be45e8\": rpc error: code = NotFound desc = could not find container \"af348938d03b7e4c859b97e37da9b81822e352b4aabbe041ecfedb3636be45e8\": container with ID starting with af348938d03b7e4c859b97e37da9b81822e352b4aabbe041ecfedb3636be45e8 not found: ID does not exist" Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.300072 4553 scope.go:117] "RemoveContainer" containerID="4717faa4db12a01ce3520138983bc2c3097d6de4d06054c633a148d37b3e2431" Sep 30 19:56:15 crc kubenswrapper[4553]: E0930 19:56:15.300647 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4717faa4db12a01ce3520138983bc2c3097d6de4d06054c633a148d37b3e2431\": container with ID starting with 4717faa4db12a01ce3520138983bc2c3097d6de4d06054c633a148d37b3e2431 not found: ID does not exist" containerID="4717faa4db12a01ce3520138983bc2c3097d6de4d06054c633a148d37b3e2431" Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.300699 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4717faa4db12a01ce3520138983bc2c3097d6de4d06054c633a148d37b3e2431"} err="failed to get container status \"4717faa4db12a01ce3520138983bc2c3097d6de4d06054c633a148d37b3e2431\": rpc error: code = NotFound desc = could not find container \"4717faa4db12a01ce3520138983bc2c3097d6de4d06054c633a148d37b3e2431\": container with ID 
starting with 4717faa4db12a01ce3520138983bc2c3097d6de4d06054c633a148d37b3e2431 not found: ID does not exist" Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.300729 4553 scope.go:117] "RemoveContainer" containerID="96cf4fae185f2064029e0b5624b830d42eea583049a22e5da950c0e72b4a56b0" Sep 30 19:56:15 crc kubenswrapper[4553]: E0930 19:56:15.301065 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96cf4fae185f2064029e0b5624b830d42eea583049a22e5da950c0e72b4a56b0\": container with ID starting with 96cf4fae185f2064029e0b5624b830d42eea583049a22e5da950c0e72b4a56b0 not found: ID does not exist" containerID="96cf4fae185f2064029e0b5624b830d42eea583049a22e5da950c0e72b4a56b0" Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.301286 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96cf4fae185f2064029e0b5624b830d42eea583049a22e5da950c0e72b4a56b0"} err="failed to get container status \"96cf4fae185f2064029e0b5624b830d42eea583049a22e5da950c0e72b4a56b0\": rpc error: code = NotFound desc = could not find container \"96cf4fae185f2064029e0b5624b830d42eea583049a22e5da950c0e72b4a56b0\": container with ID starting with 96cf4fae185f2064029e0b5624b830d42eea583049a22e5da950c0e72b4a56b0 not found: ID does not exist" Sep 30 19:56:15 crc kubenswrapper[4553]: I0930 19:56:15.513284 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3ee2711-44a8-4de2-97bd-cf997c2974e9" path="/var/lib/kubelet/pods/a3ee2711-44a8-4de2-97bd-cf997c2974e9/volumes" Sep 30 19:56:29 crc kubenswrapper[4553]: I0930 19:56:29.842582 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xz69w_51d0f112-9b15-4602-b8a6-16e79dfeb4cb/control-plane-machine-set-operator/0.log" Sep 30 19:56:30 crc kubenswrapper[4553]: I0930 19:56:30.031588 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-z7r6d_fd13fb01-b6ec-486e-8b39-7440a349ae64/kube-rbac-proxy/0.log" Sep 30 19:56:30 crc kubenswrapper[4553]: I0930 19:56:30.148795 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-z7r6d_fd13fb01-b6ec-486e-8b39-7440a349ae64/machine-api-operator/0.log" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.026548 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7rsq8"] Sep 30 19:56:38 crc kubenswrapper[4553]: E0930 19:56:38.028785 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5" containerName="container-00" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.028886 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5" containerName="container-00" Sep 30 19:56:38 crc kubenswrapper[4553]: E0930 19:56:38.028965 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3ee2711-44a8-4de2-97bd-cf997c2974e9" containerName="registry-server" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.029030 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3ee2711-44a8-4de2-97bd-cf997c2974e9" containerName="registry-server" Sep 30 19:56:38 crc kubenswrapper[4553]: E0930 19:56:38.029143 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3ee2711-44a8-4de2-97bd-cf997c2974e9" containerName="extract-content" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.029223 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3ee2711-44a8-4de2-97bd-cf997c2974e9" containerName="extract-content" Sep 30 19:56:38 crc kubenswrapper[4553]: E0930 19:56:38.029327 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3ee2711-44a8-4de2-97bd-cf997c2974e9" containerName="extract-utilities" Sep 30 19:56:38 crc kubenswrapper[4553]: 
I0930 19:56:38.029406 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3ee2711-44a8-4de2-97bd-cf997c2974e9" containerName="extract-utilities" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.029711 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e6595a8-ef8b-4e16-a9e3-ece9b0e692f5" containerName="container-00" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.029818 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3ee2711-44a8-4de2-97bd-cf997c2974e9" containerName="registry-server" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.031512 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.051127 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7rsq8"] Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.123912 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3251305-58ed-4b31-a523-66ba99240ec1-utilities\") pod \"community-operators-7rsq8\" (UID: \"a3251305-58ed-4b31-a523-66ba99240ec1\") " pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.124245 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3251305-58ed-4b31-a523-66ba99240ec1-catalog-content\") pod \"community-operators-7rsq8\" (UID: \"a3251305-58ed-4b31-a523-66ba99240ec1\") " pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.124393 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq7j9\" (UniqueName: 
\"kubernetes.io/projected/a3251305-58ed-4b31-a523-66ba99240ec1-kube-api-access-zq7j9\") pod \"community-operators-7rsq8\" (UID: \"a3251305-58ed-4b31-a523-66ba99240ec1\") " pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.225654 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq7j9\" (UniqueName: \"kubernetes.io/projected/a3251305-58ed-4b31-a523-66ba99240ec1-kube-api-access-zq7j9\") pod \"community-operators-7rsq8\" (UID: \"a3251305-58ed-4b31-a523-66ba99240ec1\") " pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.225753 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3251305-58ed-4b31-a523-66ba99240ec1-utilities\") pod \"community-operators-7rsq8\" (UID: \"a3251305-58ed-4b31-a523-66ba99240ec1\") " pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.225812 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3251305-58ed-4b31-a523-66ba99240ec1-catalog-content\") pod \"community-operators-7rsq8\" (UID: \"a3251305-58ed-4b31-a523-66ba99240ec1\") " pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.226332 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3251305-58ed-4b31-a523-66ba99240ec1-catalog-content\") pod \"community-operators-7rsq8\" (UID: \"a3251305-58ed-4b31-a523-66ba99240ec1\") " pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.226355 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a3251305-58ed-4b31-a523-66ba99240ec1-utilities\") pod \"community-operators-7rsq8\" (UID: \"a3251305-58ed-4b31-a523-66ba99240ec1\") " pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.248168 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq7j9\" (UniqueName: \"kubernetes.io/projected/a3251305-58ed-4b31-a523-66ba99240ec1-kube-api-access-zq7j9\") pod \"community-operators-7rsq8\" (UID: \"a3251305-58ed-4b31-a523-66ba99240ec1\") " pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.364008 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:38 crc kubenswrapper[4553]: I0930 19:56:38.987449 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7rsq8"] Sep 30 19:56:39 crc kubenswrapper[4553]: I0930 19:56:39.443725 4553 generic.go:334] "Generic (PLEG): container finished" podID="a3251305-58ed-4b31-a523-66ba99240ec1" containerID="ae19c5889ecb2f74e9fad50bdffc5f76340cd6aa68fe50d4bbda9d0e22d001ed" exitCode=0 Sep 30 19:56:39 crc kubenswrapper[4553]: I0930 19:56:39.443920 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rsq8" event={"ID":"a3251305-58ed-4b31-a523-66ba99240ec1","Type":"ContainerDied","Data":"ae19c5889ecb2f74e9fad50bdffc5f76340cd6aa68fe50d4bbda9d0e22d001ed"} Sep 30 19:56:39 crc kubenswrapper[4553]: I0930 19:56:39.444031 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rsq8" event={"ID":"a3251305-58ed-4b31-a523-66ba99240ec1","Type":"ContainerStarted","Data":"613deb4648424d1fcea09cb5c9ad53092e8a0f62fcd2881f82726adb204f254e"} Sep 30 19:56:40 crc kubenswrapper[4553]: I0930 19:56:40.455951 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-7rsq8" event={"ID":"a3251305-58ed-4b31-a523-66ba99240ec1","Type":"ContainerStarted","Data":"01ebc8ce81873788aad739391694b9c3a8214f925b70d6c7aa6be08946d1fa97"} Sep 30 19:56:42 crc kubenswrapper[4553]: I0930 19:56:42.479743 4553 generic.go:334] "Generic (PLEG): container finished" podID="a3251305-58ed-4b31-a523-66ba99240ec1" containerID="01ebc8ce81873788aad739391694b9c3a8214f925b70d6c7aa6be08946d1fa97" exitCode=0 Sep 30 19:56:42 crc kubenswrapper[4553]: I0930 19:56:42.479789 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rsq8" event={"ID":"a3251305-58ed-4b31-a523-66ba99240ec1","Type":"ContainerDied","Data":"01ebc8ce81873788aad739391694b9c3a8214f925b70d6c7aa6be08946d1fa97"} Sep 30 19:56:43 crc kubenswrapper[4553]: I0930 19:56:43.494078 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rsq8" event={"ID":"a3251305-58ed-4b31-a523-66ba99240ec1","Type":"ContainerStarted","Data":"07d2cf229e4ad595292a1b461f748a361df1183a7b98f4af575c882620206af3"} Sep 30 19:56:43 crc kubenswrapper[4553]: I0930 19:56:43.521271 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7rsq8" podStartSLOduration=2.009978012 podStartE2EDuration="5.521252065s" podCreationTimestamp="2025-09-30 19:56:38 +0000 UTC" firstStartedPulling="2025-09-30 19:56:39.445584387 +0000 UTC m=+1452.645086527" lastFinishedPulling="2025-09-30 19:56:42.95685846 +0000 UTC m=+1456.156360580" observedRunningTime="2025-09-30 19:56:43.513438995 +0000 UTC m=+1456.712941125" watchObservedRunningTime="2025-09-30 19:56:43.521252065 +0000 UTC m=+1456.720754195" Sep 30 19:56:44 crc kubenswrapper[4553]: I0930 19:56:44.761814 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-wr26z_aa85437b-afbc-4b69-8e83-a4138eb4c992/cert-manager-controller/0.log" Sep 30 19:56:45 crc 
kubenswrapper[4553]: I0930 19:56:45.183581 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-sht4s_9a08fdbf-1c77-41d2-9f95-97d8a44f709c/cert-manager-webhook/0.log" Sep 30 19:56:45 crc kubenswrapper[4553]: I0930 19:56:45.228903 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-c9dqh_f3dcc7e7-e268-44e4-bff9-83d283661835/cert-manager-cainjector/0.log" Sep 30 19:56:48 crc kubenswrapper[4553]: I0930 19:56:48.364581 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:48 crc kubenswrapper[4553]: I0930 19:56:48.365244 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:48 crc kubenswrapper[4553]: I0930 19:56:48.441019 4553 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:48 crc kubenswrapper[4553]: I0930 19:56:48.591129 4553 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:48 crc kubenswrapper[4553]: I0930 19:56:48.680573 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7rsq8"] Sep 30 19:56:50 crc kubenswrapper[4553]: I0930 19:56:50.554955 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7rsq8" podUID="a3251305-58ed-4b31-a523-66ba99240ec1" containerName="registry-server" containerID="cri-o://07d2cf229e4ad595292a1b461f748a361df1183a7b98f4af575c882620206af3" gracePeriod=2 Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.022693 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.161758 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3251305-58ed-4b31-a523-66ba99240ec1-utilities\") pod \"a3251305-58ed-4b31-a523-66ba99240ec1\" (UID: \"a3251305-58ed-4b31-a523-66ba99240ec1\") " Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.161958 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq7j9\" (UniqueName: \"kubernetes.io/projected/a3251305-58ed-4b31-a523-66ba99240ec1-kube-api-access-zq7j9\") pod \"a3251305-58ed-4b31-a523-66ba99240ec1\" (UID: \"a3251305-58ed-4b31-a523-66ba99240ec1\") " Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.162137 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3251305-58ed-4b31-a523-66ba99240ec1-catalog-content\") pod \"a3251305-58ed-4b31-a523-66ba99240ec1\" (UID: \"a3251305-58ed-4b31-a523-66ba99240ec1\") " Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.164616 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3251305-58ed-4b31-a523-66ba99240ec1-utilities" (OuterVolumeSpecName: "utilities") pod "a3251305-58ed-4b31-a523-66ba99240ec1" (UID: "a3251305-58ed-4b31-a523-66ba99240ec1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.172363 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3251305-58ed-4b31-a523-66ba99240ec1-kube-api-access-zq7j9" (OuterVolumeSpecName: "kube-api-access-zq7j9") pod "a3251305-58ed-4b31-a523-66ba99240ec1" (UID: "a3251305-58ed-4b31-a523-66ba99240ec1"). InnerVolumeSpecName "kube-api-access-zq7j9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.215870 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3251305-58ed-4b31-a523-66ba99240ec1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3251305-58ed-4b31-a523-66ba99240ec1" (UID: "a3251305-58ed-4b31-a523-66ba99240ec1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.263957 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq7j9\" (UniqueName: \"kubernetes.io/projected/a3251305-58ed-4b31-a523-66ba99240ec1-kube-api-access-zq7j9\") on node \"crc\" DevicePath \"\"" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.263987 4553 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3251305-58ed-4b31-a523-66ba99240ec1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.264001 4553 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3251305-58ed-4b31-a523-66ba99240ec1-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.566258 4553 generic.go:334] "Generic (PLEG): container finished" podID="a3251305-58ed-4b31-a523-66ba99240ec1" containerID="07d2cf229e4ad595292a1b461f748a361df1183a7b98f4af575c882620206af3" exitCode=0 Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.566305 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7rsq8" event={"ID":"a3251305-58ed-4b31-a523-66ba99240ec1","Type":"ContainerDied","Data":"07d2cf229e4ad595292a1b461f748a361df1183a7b98f4af575c882620206af3"} Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.566337 4553 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-7rsq8" event={"ID":"a3251305-58ed-4b31-a523-66ba99240ec1","Type":"ContainerDied","Data":"613deb4648424d1fcea09cb5c9ad53092e8a0f62fcd2881f82726adb204f254e"} Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.566358 4553 scope.go:117] "RemoveContainer" containerID="07d2cf229e4ad595292a1b461f748a361df1183a7b98f4af575c882620206af3" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.566375 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7rsq8" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.589492 4553 scope.go:117] "RemoveContainer" containerID="01ebc8ce81873788aad739391694b9c3a8214f925b70d6c7aa6be08946d1fa97" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.606359 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7rsq8"] Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.640535 4553 scope.go:117] "RemoveContainer" containerID="ae19c5889ecb2f74e9fad50bdffc5f76340cd6aa68fe50d4bbda9d0e22d001ed" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.643616 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7rsq8"] Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.718482 4553 scope.go:117] "RemoveContainer" containerID="07d2cf229e4ad595292a1b461f748a361df1183a7b98f4af575c882620206af3" Sep 30 19:56:51 crc kubenswrapper[4553]: E0930 19:56:51.726708 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07d2cf229e4ad595292a1b461f748a361df1183a7b98f4af575c882620206af3\": container with ID starting with 07d2cf229e4ad595292a1b461f748a361df1183a7b98f4af575c882620206af3 not found: ID does not exist" containerID="07d2cf229e4ad595292a1b461f748a361df1183a7b98f4af575c882620206af3" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 
19:56:51.726747 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07d2cf229e4ad595292a1b461f748a361df1183a7b98f4af575c882620206af3"} err="failed to get container status \"07d2cf229e4ad595292a1b461f748a361df1183a7b98f4af575c882620206af3\": rpc error: code = NotFound desc = could not find container \"07d2cf229e4ad595292a1b461f748a361df1183a7b98f4af575c882620206af3\": container with ID starting with 07d2cf229e4ad595292a1b461f748a361df1183a7b98f4af575c882620206af3 not found: ID does not exist" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.726772 4553 scope.go:117] "RemoveContainer" containerID="01ebc8ce81873788aad739391694b9c3a8214f925b70d6c7aa6be08946d1fa97" Sep 30 19:56:51 crc kubenswrapper[4553]: E0930 19:56:51.727389 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ebc8ce81873788aad739391694b9c3a8214f925b70d6c7aa6be08946d1fa97\": container with ID starting with 01ebc8ce81873788aad739391694b9c3a8214f925b70d6c7aa6be08946d1fa97 not found: ID does not exist" containerID="01ebc8ce81873788aad739391694b9c3a8214f925b70d6c7aa6be08946d1fa97" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.727411 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ebc8ce81873788aad739391694b9c3a8214f925b70d6c7aa6be08946d1fa97"} err="failed to get container status \"01ebc8ce81873788aad739391694b9c3a8214f925b70d6c7aa6be08946d1fa97\": rpc error: code = NotFound desc = could not find container \"01ebc8ce81873788aad739391694b9c3a8214f925b70d6c7aa6be08946d1fa97\": container with ID starting with 01ebc8ce81873788aad739391694b9c3a8214f925b70d6c7aa6be08946d1fa97 not found: ID does not exist" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.727425 4553 scope.go:117] "RemoveContainer" containerID="ae19c5889ecb2f74e9fad50bdffc5f76340cd6aa68fe50d4bbda9d0e22d001ed" Sep 30 19:56:51 crc 
kubenswrapper[4553]: E0930 19:56:51.727641 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae19c5889ecb2f74e9fad50bdffc5f76340cd6aa68fe50d4bbda9d0e22d001ed\": container with ID starting with ae19c5889ecb2f74e9fad50bdffc5f76340cd6aa68fe50d4bbda9d0e22d001ed not found: ID does not exist" containerID="ae19c5889ecb2f74e9fad50bdffc5f76340cd6aa68fe50d4bbda9d0e22d001ed" Sep 30 19:56:51 crc kubenswrapper[4553]: I0930 19:56:51.727661 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae19c5889ecb2f74e9fad50bdffc5f76340cd6aa68fe50d4bbda9d0e22d001ed"} err="failed to get container status \"ae19c5889ecb2f74e9fad50bdffc5f76340cd6aa68fe50d4bbda9d0e22d001ed\": rpc error: code = NotFound desc = could not find container \"ae19c5889ecb2f74e9fad50bdffc5f76340cd6aa68fe50d4bbda9d0e22d001ed\": container with ID starting with ae19c5889ecb2f74e9fad50bdffc5f76340cd6aa68fe50d4bbda9d0e22d001ed not found: ID does not exist" Sep 30 19:56:53 crc kubenswrapper[4553]: I0930 19:56:53.518391 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3251305-58ed-4b31-a523-66ba99240ec1" path="/var/lib/kubelet/pods/a3251305-58ed-4b31-a523-66ba99240ec1/volumes" Sep 30 19:56:59 crc kubenswrapper[4553]: I0930 19:56:59.144399 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-4bh2w_f94ac837-cb01-45f9-a065-76734fc913c1/nmstate-console-plugin/0.log" Sep 30 19:56:59 crc kubenswrapper[4553]: I0930 19:56:59.290932 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-cgzk9_18f2ad62-a815-4e7b-95ad-eb0f99c24665/nmstate-handler/0.log" Sep 30 19:56:59 crc kubenswrapper[4553]: I0930 19:56:59.409192 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-b2m5s_c3247d9f-1127-4fa1-9124-815ab13ffadb/kube-rbac-proxy/0.log" Sep 30 19:56:59 crc kubenswrapper[4553]: I0930 19:56:59.452523 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-b2m5s_c3247d9f-1127-4fa1-9124-815ab13ffadb/nmstate-metrics/0.log" Sep 30 19:56:59 crc kubenswrapper[4553]: I0930 19:56:59.585569 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:56:59 crc kubenswrapper[4553]: I0930 19:56:59.585622 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:56:59 crc kubenswrapper[4553]: I0930 19:56:59.652634 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-xxfdp_c549e241-2eff-4e22-8d14-fb0d64873ac2/nmstate-operator/0.log" Sep 30 19:56:59 crc kubenswrapper[4553]: I0930 19:56:59.717284 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-4h579_6ccd3ce9-0e6b-4142-84c8-ce7fba27a760/nmstate-webhook/0.log" Sep 30 19:57:14 crc kubenswrapper[4553]: I0930 19:57:14.496412 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-6v9n6_c43fd261-7524-4dbc-a909-1bbc73e9f658/kube-rbac-proxy/0.log" Sep 30 19:57:14 crc kubenswrapper[4553]: I0930 19:57:14.547533 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-5d688f5ffc-6v9n6_c43fd261-7524-4dbc-a909-1bbc73e9f658/controller/0.log" Sep 30 19:57:14 crc kubenswrapper[4553]: I0930 19:57:14.670852 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-f4rnh_5e06ff00-b19c-4283-baa8-738505ae723f/frr-k8s-webhook-server/0.log" Sep 30 19:57:14 crc kubenswrapper[4553]: I0930 19:57:14.767695 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/cp-frr-files/0.log" Sep 30 19:57:14 crc kubenswrapper[4553]: I0930 19:57:14.933693 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/cp-frr-files/0.log" Sep 30 19:57:14 crc kubenswrapper[4553]: I0930 19:57:14.992922 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/cp-metrics/0.log" Sep 30 19:57:14 crc kubenswrapper[4553]: I0930 19:57:14.995888 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/cp-reloader/0.log" Sep 30 19:57:15 crc kubenswrapper[4553]: I0930 19:57:15.002458 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/cp-reloader/0.log" Sep 30 19:57:15 crc kubenswrapper[4553]: I0930 19:57:15.218843 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/cp-metrics/0.log" Sep 30 19:57:15 crc kubenswrapper[4553]: I0930 19:57:15.218988 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/cp-metrics/0.log" Sep 30 19:57:15 crc kubenswrapper[4553]: I0930 19:57:15.236581 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/cp-reloader/0.log" Sep 30 19:57:15 crc kubenswrapper[4553]: I0930 19:57:15.274956 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/cp-frr-files/0.log" Sep 30 19:57:15 crc kubenswrapper[4553]: I0930 19:57:15.444275 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/cp-frr-files/0.log" Sep 30 19:57:15 crc kubenswrapper[4553]: I0930 19:57:15.452929 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/cp-reloader/0.log" Sep 30 19:57:15 crc kubenswrapper[4553]: I0930 19:57:15.475892 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/cp-metrics/0.log" Sep 30 19:57:15 crc kubenswrapper[4553]: I0930 19:57:15.493309 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/controller/0.log" Sep 30 19:57:15 crc kubenswrapper[4553]: I0930 19:57:15.692566 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/frr-metrics/0.log" Sep 30 19:57:15 crc kubenswrapper[4553]: I0930 19:57:15.707949 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/kube-rbac-proxy/0.log" Sep 30 19:57:15 crc kubenswrapper[4553]: I0930 19:57:15.789171 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/kube-rbac-proxy-frr/0.log" Sep 30 19:57:16 crc kubenswrapper[4553]: I0930 19:57:16.045252 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/reloader/0.log" Sep 30 19:57:16 crc kubenswrapper[4553]: I0930 19:57:16.122070 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6dc6c6544f-hp9t6_b2e7636d-a087-4849-9440-0095096c8022/manager/0.log" Sep 30 19:57:16 crc kubenswrapper[4553]: I0930 19:57:16.262266 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-845c9f75c7-nj9qh_c6c9439c-02e4-4e1d-8eca-27dfc7b0b127/webhook-server/0.log" Sep 30 19:57:16 crc kubenswrapper[4553]: I0930 19:57:16.404350 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-xpxlt_ff5f7ae5-5dae-4d60-8202-f3e1cf0546d8/frr/0.log" Sep 30 19:57:16 crc kubenswrapper[4553]: I0930 19:57:16.497998 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5k2m2_c810c77e-e85f-4932-aac6-45dc8419540b/kube-rbac-proxy/0.log" Sep 30 19:57:16 crc kubenswrapper[4553]: I0930 19:57:16.767334 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5k2m2_c810c77e-e85f-4932-aac6-45dc8419540b/speaker/0.log" Sep 30 19:57:29 crc kubenswrapper[4553]: I0930 19:57:29.584807 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:57:29 crc kubenswrapper[4553]: I0930 19:57:29.585344 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:57:29 crc 
kubenswrapper[4553]: I0930 19:57:29.802390 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2_04da03ab-c17c-40e3-ab34-524cda37de29/util/0.log" Sep 30 19:57:29 crc kubenswrapper[4553]: I0930 19:57:29.929675 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2_04da03ab-c17c-40e3-ab34-524cda37de29/util/0.log" Sep 30 19:57:30 crc kubenswrapper[4553]: I0930 19:57:30.003616 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2_04da03ab-c17c-40e3-ab34-524cda37de29/pull/0.log" Sep 30 19:57:30 crc kubenswrapper[4553]: I0930 19:57:30.040488 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2_04da03ab-c17c-40e3-ab34-524cda37de29/pull/0.log" Sep 30 19:57:30 crc kubenswrapper[4553]: I0930 19:57:30.150381 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2_04da03ab-c17c-40e3-ab34-524cda37de29/util/0.log" Sep 30 19:57:30 crc kubenswrapper[4553]: I0930 19:57:30.178125 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2_04da03ab-c17c-40e3-ab34-524cda37de29/pull/0.log" Sep 30 19:57:30 crc kubenswrapper[4553]: I0930 19:57:30.200762 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bczpms2_04da03ab-c17c-40e3-ab34-524cda37de29/extract/0.log" Sep 30 19:57:30 crc kubenswrapper[4553]: I0930 19:57:30.370615 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-rsgr5_621efb6a-40a5-416f-a473-4bf9e8837b76/extract-utilities/0.log" Sep 30 19:57:30 crc kubenswrapper[4553]: I0930 19:57:30.532849 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsgr5_621efb6a-40a5-416f-a473-4bf9e8837b76/extract-content/0.log" Sep 30 19:57:30 crc kubenswrapper[4553]: I0930 19:57:30.574476 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsgr5_621efb6a-40a5-416f-a473-4bf9e8837b76/extract-utilities/0.log" Sep 30 19:57:30 crc kubenswrapper[4553]: I0930 19:57:30.582437 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsgr5_621efb6a-40a5-416f-a473-4bf9e8837b76/extract-content/0.log" Sep 30 19:57:30 crc kubenswrapper[4553]: I0930 19:57:30.711981 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsgr5_621efb6a-40a5-416f-a473-4bf9e8837b76/extract-content/0.log" Sep 30 19:57:30 crc kubenswrapper[4553]: I0930 19:57:30.736912 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsgr5_621efb6a-40a5-416f-a473-4bf9e8837b76/extract-utilities/0.log" Sep 30 19:57:31 crc kubenswrapper[4553]: I0930 19:57:31.002908 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q6qnb_044b8190-3a71-4b25-a654-8087bbacd1fd/extract-utilities/0.log" Sep 30 19:57:31 crc kubenswrapper[4553]: I0930 19:57:31.108063 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rsgr5_621efb6a-40a5-416f-a473-4bf9e8837b76/registry-server/0.log" Sep 30 19:57:31 crc kubenswrapper[4553]: I0930 19:57:31.152681 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-q6qnb_044b8190-3a71-4b25-a654-8087bbacd1fd/extract-content/0.log" Sep 30 19:57:31 crc kubenswrapper[4553]: I0930 19:57:31.157713 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q6qnb_044b8190-3a71-4b25-a654-8087bbacd1fd/extract-utilities/0.log" Sep 30 19:57:31 crc kubenswrapper[4553]: I0930 19:57:31.229289 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q6qnb_044b8190-3a71-4b25-a654-8087bbacd1fd/extract-content/0.log" Sep 30 19:57:31 crc kubenswrapper[4553]: I0930 19:57:31.375955 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q6qnb_044b8190-3a71-4b25-a654-8087bbacd1fd/extract-utilities/0.log" Sep 30 19:57:31 crc kubenswrapper[4553]: I0930 19:57:31.462418 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q6qnb_044b8190-3a71-4b25-a654-8087bbacd1fd/extract-content/0.log" Sep 30 19:57:31 crc kubenswrapper[4553]: I0930 19:57:31.601844 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff_841a6dbb-567d-429f-9096-23b69f7b9e5f/util/0.log" Sep 30 19:57:31 crc kubenswrapper[4553]: I0930 19:57:31.671576 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-q6qnb_044b8190-3a71-4b25-a654-8087bbacd1fd/registry-server/0.log" Sep 30 19:57:31 crc kubenswrapper[4553]: I0930 19:57:31.866313 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff_841a6dbb-567d-429f-9096-23b69f7b9e5f/util/0.log" Sep 30 19:57:31 crc kubenswrapper[4553]: I0930 19:57:31.869756 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff_841a6dbb-567d-429f-9096-23b69f7b9e5f/pull/0.log" Sep 30 19:57:31 crc kubenswrapper[4553]: I0930 19:57:31.916550 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff_841a6dbb-567d-429f-9096-23b69f7b9e5f/pull/0.log" Sep 30 19:57:32 crc kubenswrapper[4553]: I0930 19:57:32.073969 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff_841a6dbb-567d-429f-9096-23b69f7b9e5f/util/0.log" Sep 30 19:57:32 crc kubenswrapper[4553]: I0930 19:57:32.113452 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff_841a6dbb-567d-429f-9096-23b69f7b9e5f/pull/0.log" Sep 30 19:57:32 crc kubenswrapper[4553]: I0930 19:57:32.140648 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d9676lff_841a6dbb-567d-429f-9096-23b69f7b9e5f/extract/0.log" Sep 30 19:57:32 crc kubenswrapper[4553]: I0930 19:57:32.245406 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lgw6c_c17292c5-31e1-4fd3-80cc-a635a1ee1348/marketplace-operator/0.log" Sep 30 19:57:32 crc kubenswrapper[4553]: I0930 19:57:32.334979 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-58c2w_85b5e9a0-50cb-48f9-beb9-ecd2b1995370/extract-utilities/0.log" Sep 30 19:57:32 crc kubenswrapper[4553]: I0930 19:57:32.531849 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-58c2w_85b5e9a0-50cb-48f9-beb9-ecd2b1995370/extract-utilities/0.log" Sep 30 19:57:32 crc kubenswrapper[4553]: I0930 19:57:32.546673 4553 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-58c2w_85b5e9a0-50cb-48f9-beb9-ecd2b1995370/extract-content/0.log" Sep 30 19:57:32 crc kubenswrapper[4553]: I0930 19:57:32.560543 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-58c2w_85b5e9a0-50cb-48f9-beb9-ecd2b1995370/extract-content/0.log" Sep 30 19:57:32 crc kubenswrapper[4553]: I0930 19:57:32.664401 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-58c2w_85b5e9a0-50cb-48f9-beb9-ecd2b1995370/extract-utilities/0.log" Sep 30 19:57:32 crc kubenswrapper[4553]: I0930 19:57:32.701532 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-58c2w_85b5e9a0-50cb-48f9-beb9-ecd2b1995370/extract-content/0.log" Sep 30 19:57:32 crc kubenswrapper[4553]: I0930 19:57:32.783273 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-58c2w_85b5e9a0-50cb-48f9-beb9-ecd2b1995370/registry-server/0.log" Sep 30 19:57:32 crc kubenswrapper[4553]: I0930 19:57:32.978315 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpccw_93b0562f-066d-4491-a3a8-5b3d36463f49/extract-utilities/0.log" Sep 30 19:57:33 crc kubenswrapper[4553]: I0930 19:57:33.095842 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpccw_93b0562f-066d-4491-a3a8-5b3d36463f49/extract-content/0.log" Sep 30 19:57:33 crc kubenswrapper[4553]: I0930 19:57:33.101196 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpccw_93b0562f-066d-4491-a3a8-5b3d36463f49/extract-utilities/0.log" Sep 30 19:57:33 crc kubenswrapper[4553]: I0930 19:57:33.113275 4553 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-kpccw_93b0562f-066d-4491-a3a8-5b3d36463f49/extract-content/0.log" Sep 30 19:57:33 crc kubenswrapper[4553]: I0930 19:57:33.270094 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpccw_93b0562f-066d-4491-a3a8-5b3d36463f49/extract-content/0.log" Sep 30 19:57:33 crc kubenswrapper[4553]: I0930 19:57:33.284284 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpccw_93b0562f-066d-4491-a3a8-5b3d36463f49/extract-utilities/0.log" Sep 30 19:57:33 crc kubenswrapper[4553]: I0930 19:57:33.495473 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-kpccw_93b0562f-066d-4491-a3a8-5b3d36463f49/registry-server/0.log" Sep 30 19:57:56 crc kubenswrapper[4553]: I0930 19:57:56.080777 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-8vprf"] Sep 30 19:57:56 crc kubenswrapper[4553]: I0930 19:57:56.087878 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-8vprf"] Sep 30 19:57:57 crc kubenswrapper[4553]: I0930 19:57:57.546239 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1c23598-c1a7-4544-9204-f071ac589644" path="/var/lib/kubelet/pods/f1c23598-c1a7-4544-9204-f071ac589644/volumes" Sep 30 19:57:59 crc kubenswrapper[4553]: I0930 19:57:59.584396 4553 patch_prober.go:28] interesting pod/machine-config-daemon-9n4dl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 19:57:59 crc kubenswrapper[4553]: I0930 19:57:59.584702 4553 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 19:57:59 crc kubenswrapper[4553]: I0930 19:57:59.584743 4553 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" Sep 30 19:57:59 crc kubenswrapper[4553]: I0930 19:57:59.585405 4553 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29"} pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 19:57:59 crc kubenswrapper[4553]: I0930 19:57:59.585482 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerName="machine-config-daemon" containerID="cri-o://4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" gracePeriod=600 Sep 30 19:57:59 crc kubenswrapper[4553]: E0930 19:57:59.719064 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 19:57:59 crc kubenswrapper[4553]: E0930 19:57:59.802901 4553 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e817c67_7688_42d4_8a82_ce72282cbb51.slice/crio-conmon-4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e817c67_7688_42d4_8a82_ce72282cbb51.slice/crio-4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29.scope\": RecentStats: unable to find data in memory cache]" Sep 30 19:58:00 crc kubenswrapper[4553]: I0930 19:58:00.181493 4553 generic.go:334] "Generic (PLEG): container finished" podID="1e817c67-7688-42d4-8a82-ce72282cbb51" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" exitCode=0 Sep 30 19:58:00 crc kubenswrapper[4553]: I0930 19:58:00.181546 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" event={"ID":"1e817c67-7688-42d4-8a82-ce72282cbb51","Type":"ContainerDied","Data":"4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29"} Sep 30 19:58:00 crc kubenswrapper[4553]: I0930 19:58:00.181870 4553 scope.go:117] "RemoveContainer" containerID="c7864ee52b427b57981d569d4ee7a9292f56eb6909fb29d851a6775585474b37" Sep 30 19:58:00 crc kubenswrapper[4553]: I0930 19:58:00.182497 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 19:58:00 crc kubenswrapper[4553]: E0930 19:58:00.182811 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 19:58:01 crc kubenswrapper[4553]: I0930 19:58:01.039053 4553 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8qx8h"] Sep 30 19:58:01 crc kubenswrapper[4553]: I0930 19:58:01.061840 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8qx8h"] Sep 30 19:58:01 crc kubenswrapper[4553]: I0930 19:58:01.516657 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7810c768-948a-47c9-99e0-4b9c5c38f7ba" path="/var/lib/kubelet/pods/7810c768-948a-47c9-99e0-4b9c5c38f7ba/volumes" Sep 30 19:58:03 crc kubenswrapper[4553]: E0930 19:58:03.196978 4553 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.17:56462->38.102.83.17:43737: read tcp 38.102.83.17:56462->38.102.83.17:43737: read: connection reset by peer Sep 30 19:58:07 crc kubenswrapper[4553]: I0930 19:58:07.028256 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a1f4-account-create-vqppj"] Sep 30 19:58:07 crc kubenswrapper[4553]: I0930 19:58:07.036908 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-vhp2l"] Sep 30 19:58:07 crc kubenswrapper[4553]: I0930 19:58:07.044652 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-a1f4-account-create-vqppj"] Sep 30 19:58:07 crc kubenswrapper[4553]: I0930 19:58:07.052429 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-vhp2l"] Sep 30 19:58:07 crc kubenswrapper[4553]: I0930 19:58:07.514880 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c5e64e4-7905-4524-bd33-8ab355eb2c90" path="/var/lib/kubelet/pods/5c5e64e4-7905-4524-bd33-8ab355eb2c90/volumes" Sep 30 19:58:07 crc kubenswrapper[4553]: I0930 19:58:07.515453 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8" path="/var/lib/kubelet/pods/cdcfeb4c-8e6b-4854-8ec4-a82942ca83a8/volumes" Sep 30 19:58:11 crc kubenswrapper[4553]: I0930 19:58:11.030816 4553 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-446a-account-create-rdrh4"] Sep 30 19:58:11 crc kubenswrapper[4553]: I0930 19:58:11.038656 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-446a-account-create-rdrh4"] Sep 30 19:58:11 crc kubenswrapper[4553]: I0930 19:58:11.514787 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="657cedd1-5a4e-4219-977b-92da68039989" path="/var/lib/kubelet/pods/657cedd1-5a4e-4219-977b-92da68039989/volumes" Sep 30 19:58:13 crc kubenswrapper[4553]: I0930 19:58:13.504300 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 19:58:13 crc kubenswrapper[4553]: E0930 19:58:13.504838 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 19:58:24 crc kubenswrapper[4553]: I0930 19:58:24.505498 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 19:58:24 crc kubenswrapper[4553]: E0930 19:58:24.506881 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 19:58:31 crc kubenswrapper[4553]: I0930 19:58:31.058350 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-e21a-account-create-tqjjr"] Sep 30 19:58:31 crc kubenswrapper[4553]: I0930 19:58:31.074667 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-e21a-account-create-tqjjr"] Sep 30 19:58:31 crc kubenswrapper[4553]: I0930 19:58:31.527379 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dac5b01-0adc-4d37-9dcb-707537a02cf0" path="/var/lib/kubelet/pods/8dac5b01-0adc-4d37-9dcb-707537a02cf0/volumes" Sep 30 19:58:35 crc kubenswrapper[4553]: I0930 19:58:35.503909 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 19:58:35 crc kubenswrapper[4553]: E0930 19:58:35.504747 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 19:58:36 crc kubenswrapper[4553]: I0930 19:58:36.049374 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6q88m"] Sep 30 19:58:36 crc kubenswrapper[4553]: I0930 19:58:36.049728 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6q88m"] Sep 30 19:58:37 crc kubenswrapper[4553]: I0930 19:58:37.053335 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-nvx5r"] Sep 30 19:58:37 crc kubenswrapper[4553]: I0930 19:58:37.073592 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qg8lx"] Sep 30 19:58:37 crc kubenswrapper[4553]: I0930 19:58:37.087208 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-nvx5r"] Sep 30 19:58:37 crc kubenswrapper[4553]: I0930 
19:58:37.095426 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qg8lx"] Sep 30 19:58:37 crc kubenswrapper[4553]: I0930 19:58:37.530215 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0be7cfb-07fc-426f-a177-4199643cff46" path="/var/lib/kubelet/pods/b0be7cfb-07fc-426f-a177-4199643cff46/volumes" Sep 30 19:58:37 crc kubenswrapper[4553]: I0930 19:58:37.535450 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5bd8102-b39f-40ee-b03d-9912adca9e41" path="/var/lib/kubelet/pods/d5bd8102-b39f-40ee-b03d-9912adca9e41/volumes" Sep 30 19:58:37 crc kubenswrapper[4553]: I0930 19:58:37.539499 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f3b7b5-90a0-44bc-9ba4-40729ffe3000" path="/var/lib/kubelet/pods/d8f3b7b5-90a0-44bc-9ba4-40729ffe3000/volumes" Sep 30 19:58:44 crc kubenswrapper[4553]: I0930 19:58:44.232035 4553 scope.go:117] "RemoveContainer" containerID="d84c03382c7aa10aa510431435cb9fe64a1ccaee7681445665778d72fab29d9e" Sep 30 19:58:44 crc kubenswrapper[4553]: I0930 19:58:44.270812 4553 scope.go:117] "RemoveContainer" containerID="a2314c63c3c97b40030a5904d6d4593b908a8a65f7aeda88b89ffa11914c35b3" Sep 30 19:58:44 crc kubenswrapper[4553]: I0930 19:58:44.340017 4553 scope.go:117] "RemoveContainer" containerID="a4ae681a0a75a849d4392af055327ef594180ee13088f923328005db87a42adf" Sep 30 19:58:44 crc kubenswrapper[4553]: I0930 19:58:44.402670 4553 scope.go:117] "RemoveContainer" containerID="e6ff23f280f5ea964e8ec5f0752a4d1cec1ec047b059abe13c4e5dd98098ad22" Sep 30 19:58:44 crc kubenswrapper[4553]: I0930 19:58:44.457777 4553 scope.go:117] "RemoveContainer" containerID="d7a98675df4d5aa33e7cfcebb4518656a0a4e1ab63b3ad4455b7514f01b890bb" Sep 30 19:58:44 crc kubenswrapper[4553]: I0930 19:58:44.516464 4553 scope.go:117] "RemoveContainer" containerID="cd66fe117268930bf59d7a82e1ac54ea688cff6521e461a1746a34b6d2aa95ea" Sep 30 19:58:44 crc kubenswrapper[4553]: I0930 19:58:44.551757 
4553 scope.go:117] "RemoveContainer" containerID="4cf710ba1ea39cbd129d7ea179d80883ef83f5c8a8bc6410b2f2b1ecb7d68c30" Sep 30 19:58:44 crc kubenswrapper[4553]: I0930 19:58:44.576160 4553 scope.go:117] "RemoveContainer" containerID="20de24747c34af913a2ed98f51071a07f5cc4de34538842fce6d04a6a43c4aff" Sep 30 19:58:44 crc kubenswrapper[4553]: I0930 19:58:44.597410 4553 scope.go:117] "RemoveContainer" containerID="de578ec35b37d0019c9064fa6449537af89e39d16c09cf7bcfe883868a2c834a" Sep 30 19:58:46 crc kubenswrapper[4553]: I0930 19:58:46.505468 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 19:58:46 crc kubenswrapper[4553]: E0930 19:58:46.506285 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 19:58:53 crc kubenswrapper[4553]: I0930 19:58:53.050808 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-981f-account-create-68lnd"] Sep 30 19:58:53 crc kubenswrapper[4553]: I0930 19:58:53.060422 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-981f-account-create-68lnd"] Sep 30 19:58:53 crc kubenswrapper[4553]: I0930 19:58:53.070870 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e6e0-account-create-g2wlx"] Sep 30 19:58:53 crc kubenswrapper[4553]: I0930 19:58:53.077746 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-e6e0-account-create-g2wlx"] Sep 30 19:58:53 crc kubenswrapper[4553]: I0930 19:58:53.531872 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2f1d2b7-12bf-4d45-b80e-712e015a61e5" 
path="/var/lib/kubelet/pods/b2f1d2b7-12bf-4d45-b80e-712e015a61e5/volumes" Sep 30 19:58:53 crc kubenswrapper[4553]: I0930 19:58:53.536299 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccd42fc2-3f65-4d1f-a6bc-4c564f653f90" path="/var/lib/kubelet/pods/ccd42fc2-3f65-4d1f-a6bc-4c564f653f90/volumes" Sep 30 19:58:56 crc kubenswrapper[4553]: I0930 19:58:56.053801 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f77f-account-create-zqps4"] Sep 30 19:58:56 crc kubenswrapper[4553]: I0930 19:58:56.063574 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f77f-account-create-zqps4"] Sep 30 19:58:57 crc kubenswrapper[4553]: I0930 19:58:57.518677 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 19:58:57 crc kubenswrapper[4553]: E0930 19:58:57.521590 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 19:58:57 crc kubenswrapper[4553]: I0930 19:58:57.533571 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc06475-3808-4387-8427-c82f3f77ba73" path="/var/lib/kubelet/pods/ecc06475-3808-4387-8427-c82f3f77ba73/volumes" Sep 30 19:58:58 crc kubenswrapper[4553]: I0930 19:58:58.045659 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-hsmcl"] Sep 30 19:58:58 crc kubenswrapper[4553]: I0930 19:58:58.068537 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-hsmcl"] Sep 30 19:58:59 crc kubenswrapper[4553]: I0930 19:58:59.527571 4553 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="3f158e70-9924-417e-b100-983f574bef9a" path="/var/lib/kubelet/pods/3f158e70-9924-417e-b100-983f574bef9a/volumes" Sep 30 19:59:03 crc kubenswrapper[4553]: I0930 19:59:03.037283 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-gw5ch"] Sep 30 19:59:03 crc kubenswrapper[4553]: I0930 19:59:03.046907 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-gw5ch"] Sep 30 19:59:03 crc kubenswrapper[4553]: I0930 19:59:03.527739 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f9a8e95-e61a-473d-a74f-cf7a6820ff97" path="/var/lib/kubelet/pods/3f9a8e95-e61a-473d-a74f-cf7a6820ff97/volumes" Sep 30 19:59:11 crc kubenswrapper[4553]: I0930 19:59:11.504923 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 19:59:11 crc kubenswrapper[4553]: E0930 19:59:11.506022 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 19:59:15 crc kubenswrapper[4553]: I0930 19:59:15.094202 4553 generic.go:334] "Generic (PLEG): container finished" podID="5f02ec77-88fd-40c9-8b0f-085d34da84f7" containerID="32fef0e1b3c13456c6e8f98918416ec3cbce154c7b4f1438f7a4fac930ffa1a1" exitCode=0 Sep 30 19:59:15 crc kubenswrapper[4553]: I0930 19:59:15.094326 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4hfkh/must-gather-jw59q" event={"ID":"5f02ec77-88fd-40c9-8b0f-085d34da84f7","Type":"ContainerDied","Data":"32fef0e1b3c13456c6e8f98918416ec3cbce154c7b4f1438f7a4fac930ffa1a1"} Sep 30 19:59:15 crc kubenswrapper[4553]: 
I0930 19:59:15.096122 4553 scope.go:117] "RemoveContainer" containerID="32fef0e1b3c13456c6e8f98918416ec3cbce154c7b4f1438f7a4fac930ffa1a1" Sep 30 19:59:15 crc kubenswrapper[4553]: I0930 19:59:15.965586 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4hfkh_must-gather-jw59q_5f02ec77-88fd-40c9-8b0f-085d34da84f7/gather/0.log" Sep 30 19:59:22 crc kubenswrapper[4553]: I0930 19:59:22.504962 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 19:59:22 crc kubenswrapper[4553]: E0930 19:59:22.505957 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 19:59:24 crc kubenswrapper[4553]: I0930 19:59:24.314054 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4hfkh/must-gather-jw59q"] Sep 30 19:59:24 crc kubenswrapper[4553]: I0930 19:59:24.315467 4553 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4hfkh/must-gather-jw59q" podUID="5f02ec77-88fd-40c9-8b0f-085d34da84f7" containerName="copy" containerID="cri-o://ef9b42a2d8b2c7421ea1541f10f5b47a95642daba9bece0e87ff66bd74683e5e" gracePeriod=2 Sep 30 19:59:24 crc kubenswrapper[4553]: I0930 19:59:24.337826 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4hfkh/must-gather-jw59q"] Sep 30 19:59:24 crc kubenswrapper[4553]: I0930 19:59:24.883903 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4hfkh_must-gather-jw59q_5f02ec77-88fd-40c9-8b0f-085d34da84f7/copy/0.log" Sep 30 19:59:24 crc kubenswrapper[4553]: 
I0930 19:59:24.884715 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4hfkh/must-gather-jw59q" Sep 30 19:59:24 crc kubenswrapper[4553]: I0930 19:59:24.947220 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbmvk\" (UniqueName: \"kubernetes.io/projected/5f02ec77-88fd-40c9-8b0f-085d34da84f7-kube-api-access-dbmvk\") pod \"5f02ec77-88fd-40c9-8b0f-085d34da84f7\" (UID: \"5f02ec77-88fd-40c9-8b0f-085d34da84f7\") " Sep 30 19:59:24 crc kubenswrapper[4553]: I0930 19:59:24.947410 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5f02ec77-88fd-40c9-8b0f-085d34da84f7-must-gather-output\") pod \"5f02ec77-88fd-40c9-8b0f-085d34da84f7\" (UID: \"5f02ec77-88fd-40c9-8b0f-085d34da84f7\") " Sep 30 19:59:24 crc kubenswrapper[4553]: I0930 19:59:24.952491 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f02ec77-88fd-40c9-8b0f-085d34da84f7-kube-api-access-dbmvk" (OuterVolumeSpecName: "kube-api-access-dbmvk") pod "5f02ec77-88fd-40c9-8b0f-085d34da84f7" (UID: "5f02ec77-88fd-40c9-8b0f-085d34da84f7"). InnerVolumeSpecName "kube-api-access-dbmvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 19:59:25 crc kubenswrapper[4553]: I0930 19:59:25.049162 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbmvk\" (UniqueName: \"kubernetes.io/projected/5f02ec77-88fd-40c9-8b0f-085d34da84f7-kube-api-access-dbmvk\") on node \"crc\" DevicePath \"\"" Sep 30 19:59:25 crc kubenswrapper[4553]: I0930 19:59:25.065175 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f02ec77-88fd-40c9-8b0f-085d34da84f7-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5f02ec77-88fd-40c9-8b0f-085d34da84f7" (UID: "5f02ec77-88fd-40c9-8b0f-085d34da84f7"). 
InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 19:59:25 crc kubenswrapper[4553]: I0930 19:59:25.150625 4553 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5f02ec77-88fd-40c9-8b0f-085d34da84f7-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 19:59:25 crc kubenswrapper[4553]: I0930 19:59:25.202506 4553 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4hfkh_must-gather-jw59q_5f02ec77-88fd-40c9-8b0f-085d34da84f7/copy/0.log" Sep 30 19:59:25 crc kubenswrapper[4553]: I0930 19:59:25.203087 4553 scope.go:117] "RemoveContainer" containerID="ef9b42a2d8b2c7421ea1541f10f5b47a95642daba9bece0e87ff66bd74683e5e" Sep 30 19:59:25 crc kubenswrapper[4553]: I0930 19:59:25.203108 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4hfkh/must-gather-jw59q" Sep 30 19:59:25 crc kubenswrapper[4553]: I0930 19:59:25.203030 4553 generic.go:334] "Generic (PLEG): container finished" podID="5f02ec77-88fd-40c9-8b0f-085d34da84f7" containerID="ef9b42a2d8b2c7421ea1541f10f5b47a95642daba9bece0e87ff66bd74683e5e" exitCode=143 Sep 30 19:59:25 crc kubenswrapper[4553]: I0930 19:59:25.231562 4553 scope.go:117] "RemoveContainer" containerID="32fef0e1b3c13456c6e8f98918416ec3cbce154c7b4f1438f7a4fac930ffa1a1" Sep 30 19:59:25 crc kubenswrapper[4553]: I0930 19:59:25.291239 4553 scope.go:117] "RemoveContainer" containerID="ef9b42a2d8b2c7421ea1541f10f5b47a95642daba9bece0e87ff66bd74683e5e" Sep 30 19:59:25 crc kubenswrapper[4553]: E0930 19:59:25.293560 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef9b42a2d8b2c7421ea1541f10f5b47a95642daba9bece0e87ff66bd74683e5e\": container with ID starting with ef9b42a2d8b2c7421ea1541f10f5b47a95642daba9bece0e87ff66bd74683e5e not found: ID does not exist" 
containerID="ef9b42a2d8b2c7421ea1541f10f5b47a95642daba9bece0e87ff66bd74683e5e" Sep 30 19:59:25 crc kubenswrapper[4553]: I0930 19:59:25.293595 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef9b42a2d8b2c7421ea1541f10f5b47a95642daba9bece0e87ff66bd74683e5e"} err="failed to get container status \"ef9b42a2d8b2c7421ea1541f10f5b47a95642daba9bece0e87ff66bd74683e5e\": rpc error: code = NotFound desc = could not find container \"ef9b42a2d8b2c7421ea1541f10f5b47a95642daba9bece0e87ff66bd74683e5e\": container with ID starting with ef9b42a2d8b2c7421ea1541f10f5b47a95642daba9bece0e87ff66bd74683e5e not found: ID does not exist" Sep 30 19:59:25 crc kubenswrapper[4553]: I0930 19:59:25.293617 4553 scope.go:117] "RemoveContainer" containerID="32fef0e1b3c13456c6e8f98918416ec3cbce154c7b4f1438f7a4fac930ffa1a1" Sep 30 19:59:25 crc kubenswrapper[4553]: E0930 19:59:25.293893 4553 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32fef0e1b3c13456c6e8f98918416ec3cbce154c7b4f1438f7a4fac930ffa1a1\": container with ID starting with 32fef0e1b3c13456c6e8f98918416ec3cbce154c7b4f1438f7a4fac930ffa1a1 not found: ID does not exist" containerID="32fef0e1b3c13456c6e8f98918416ec3cbce154c7b4f1438f7a4fac930ffa1a1" Sep 30 19:59:25 crc kubenswrapper[4553]: I0930 19:59:25.293909 4553 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32fef0e1b3c13456c6e8f98918416ec3cbce154c7b4f1438f7a4fac930ffa1a1"} err="failed to get container status \"32fef0e1b3c13456c6e8f98918416ec3cbce154c7b4f1438f7a4fac930ffa1a1\": rpc error: code = NotFound desc = could not find container \"32fef0e1b3c13456c6e8f98918416ec3cbce154c7b4f1438f7a4fac930ffa1a1\": container with ID starting with 32fef0e1b3c13456c6e8f98918416ec3cbce154c7b4f1438f7a4fac930ffa1a1 not found: ID does not exist" Sep 30 19:59:25 crc kubenswrapper[4553]: I0930 19:59:25.512618 4553 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f02ec77-88fd-40c9-8b0f-085d34da84f7" path="/var/lib/kubelet/pods/5f02ec77-88fd-40c9-8b0f-085d34da84f7/volumes" Sep 30 19:59:36 crc kubenswrapper[4553]: I0930 19:59:36.504802 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 19:59:36 crc kubenswrapper[4553]: E0930 19:59:36.505693 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 19:59:43 crc kubenswrapper[4553]: I0930 19:59:43.032375 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zsslz"] Sep 30 19:59:43 crc kubenswrapper[4553]: I0930 19:59:43.042440 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zsslz"] Sep 30 19:59:43 crc kubenswrapper[4553]: I0930 19:59:43.513611 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c9fecb-7dc9-4aed-b134-98995f1cf280" path="/var/lib/kubelet/pods/08c9fecb-7dc9-4aed-b134-98995f1cf280/volumes" Sep 30 19:59:44 crc kubenswrapper[4553]: I0930 19:59:44.034992 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pk229"] Sep 30 19:59:44 crc kubenswrapper[4553]: I0930 19:59:44.044437 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-f7rgm"] Sep 30 19:59:44 crc kubenswrapper[4553]: I0930 19:59:44.050550 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-f7rgm"] Sep 30 19:59:44 crc kubenswrapper[4553]: I0930 19:59:44.057722 4553 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/keystone-bootstrap-pk229"] Sep 30 19:59:44 crc kubenswrapper[4553]: I0930 19:59:44.835159 4553 scope.go:117] "RemoveContainer" containerID="e37890e7b19c63a12e765509ea26c067a0fbd538f20814008b4129de1e200df9" Sep 30 19:59:44 crc kubenswrapper[4553]: I0930 19:59:44.899483 4553 scope.go:117] "RemoveContainer" containerID="fdb76c3419c546d756a94229f0a6eb6009114b7423d2675741c8614ce922dcad" Sep 30 19:59:44 crc kubenswrapper[4553]: I0930 19:59:44.959687 4553 scope.go:117] "RemoveContainer" containerID="501b490f06d27e303698db93df1d2d24e55b73d4d54b9e859a97d85996b56eec" Sep 30 19:59:44 crc kubenswrapper[4553]: I0930 19:59:44.985754 4553 scope.go:117] "RemoveContainer" containerID="46c9754aed86dab703e791c801f15b4797165dc388f19abeb54b3cec5780cd31" Sep 30 19:59:45 crc kubenswrapper[4553]: I0930 19:59:45.041720 4553 scope.go:117] "RemoveContainer" containerID="7afad2e950debbaf9e8d1794c09604135264a240fed5f9dbf01384cea93133a5" Sep 30 19:59:45 crc kubenswrapper[4553]: I0930 19:59:45.090106 4553 scope.go:117] "RemoveContainer" containerID="8b86fa732ebe460ff0df6d1a947d9d597f10f8fc2044fc5328f14860e3a3852c" Sep 30 19:59:45 crc kubenswrapper[4553]: I0930 19:59:45.525368 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2633e01b-c518-4077-af93-7ba213150186" path="/var/lib/kubelet/pods/2633e01b-c518-4077-af93-7ba213150186/volumes" Sep 30 19:59:45 crc kubenswrapper[4553]: I0930 19:59:45.527068 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1bf2fc0-8737-4258-9bf8-1978001043f9" path="/var/lib/kubelet/pods/d1bf2fc0-8737-4258-9bf8-1978001043f9/volumes" Sep 30 19:59:49 crc kubenswrapper[4553]: I0930 19:59:49.506014 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 19:59:49 crc kubenswrapper[4553]: E0930 19:59:49.507099 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 19:59:58 crc kubenswrapper[4553]: I0930 19:59:58.077973 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-prf67"] Sep 30 19:59:58 crc kubenswrapper[4553]: I0930 19:59:58.090974 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-prf67"] Sep 30 19:59:59 crc kubenswrapper[4553]: I0930 19:59:59.039665 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-k52t8"] Sep 30 19:59:59 crc kubenswrapper[4553]: I0930 19:59:59.042701 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-k52t8"] Sep 30 19:59:59 crc kubenswrapper[4553]: I0930 19:59:59.519687 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04f1abd5-5975-4038-98b3-4b6ff0e858f7" path="/var/lib/kubelet/pods/04f1abd5-5975-4038-98b3-4b6ff0e858f7/volumes" Sep 30 19:59:59 crc kubenswrapper[4553]: I0930 19:59:59.522021 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9958ea9-408e-4b14-8b23-dd1662654cd1" path="/var/lib/kubelet/pods/c9958ea9-408e-4b14-8b23-dd1662654cd1/volumes" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.187221 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d"] Sep 30 20:00:00 crc kubenswrapper[4553]: E0930 20:00:00.187815 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f02ec77-88fd-40c9-8b0f-085d34da84f7" containerName="gather" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.187838 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f02ec77-88fd-40c9-8b0f-085d34da84f7" 
containerName="gather" Sep 30 20:00:00 crc kubenswrapper[4553]: E0930 20:00:00.187869 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f02ec77-88fd-40c9-8b0f-085d34da84f7" containerName="copy" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.187881 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f02ec77-88fd-40c9-8b0f-085d34da84f7" containerName="copy" Sep 30 20:00:00 crc kubenswrapper[4553]: E0930 20:00:00.187894 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3251305-58ed-4b31-a523-66ba99240ec1" containerName="extract-utilities" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.187903 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3251305-58ed-4b31-a523-66ba99240ec1" containerName="extract-utilities" Sep 30 20:00:00 crc kubenswrapper[4553]: E0930 20:00:00.187921 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3251305-58ed-4b31-a523-66ba99240ec1" containerName="extract-content" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.187932 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3251305-58ed-4b31-a523-66ba99240ec1" containerName="extract-content" Sep 30 20:00:00 crc kubenswrapper[4553]: E0930 20:00:00.187955 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3251305-58ed-4b31-a523-66ba99240ec1" containerName="registry-server" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.187964 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3251305-58ed-4b31-a523-66ba99240ec1" containerName="registry-server" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.188284 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3251305-58ed-4b31-a523-66ba99240ec1" containerName="registry-server" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.188310 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f02ec77-88fd-40c9-8b0f-085d34da84f7" containerName="copy" Sep 30 
20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.188338 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f02ec77-88fd-40c9-8b0f-085d34da84f7" containerName="gather" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.189207 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.192301 4553 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.194323 4553 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.221361 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d"] Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.373158 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e629896f-1f6e-4759-ad61-7e555ffb2387-config-volume\") pod \"collect-profiles-29321040-lkx6d\" (UID: \"e629896f-1f6e-4759-ad61-7e555ffb2387\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.373232 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e629896f-1f6e-4759-ad61-7e555ffb2387-secret-volume\") pod \"collect-profiles-29321040-lkx6d\" (UID: \"e629896f-1f6e-4759-ad61-7e555ffb2387\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.373464 4553 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjq6t\" (UniqueName: \"kubernetes.io/projected/e629896f-1f6e-4759-ad61-7e555ffb2387-kube-api-access-fjq6t\") pod \"collect-profiles-29321040-lkx6d\" (UID: \"e629896f-1f6e-4759-ad61-7e555ffb2387\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.477927 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjq6t\" (UniqueName: \"kubernetes.io/projected/e629896f-1f6e-4759-ad61-7e555ffb2387-kube-api-access-fjq6t\") pod \"collect-profiles-29321040-lkx6d\" (UID: \"e629896f-1f6e-4759-ad61-7e555ffb2387\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.478220 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e629896f-1f6e-4759-ad61-7e555ffb2387-config-volume\") pod \"collect-profiles-29321040-lkx6d\" (UID: \"e629896f-1f6e-4759-ad61-7e555ffb2387\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.478612 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e629896f-1f6e-4759-ad61-7e555ffb2387-secret-volume\") pod \"collect-profiles-29321040-lkx6d\" (UID: \"e629896f-1f6e-4759-ad61-7e555ffb2387\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.479127 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e629896f-1f6e-4759-ad61-7e555ffb2387-config-volume\") pod \"collect-profiles-29321040-lkx6d\" (UID: \"e629896f-1f6e-4759-ad61-7e555ffb2387\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.500839 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjq6t\" (UniqueName: \"kubernetes.io/projected/e629896f-1f6e-4759-ad61-7e555ffb2387-kube-api-access-fjq6t\") pod \"collect-profiles-29321040-lkx6d\" (UID: \"e629896f-1f6e-4759-ad61-7e555ffb2387\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.507121 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e629896f-1f6e-4759-ad61-7e555ffb2387-secret-volume\") pod \"collect-profiles-29321040-lkx6d\" (UID: \"e629896f-1f6e-4759-ad61-7e555ffb2387\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" Sep 30 20:00:00 crc kubenswrapper[4553]: I0930 20:00:00.509393 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" Sep 30 20:00:01 crc kubenswrapper[4553]: I0930 20:00:01.012356 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d"] Sep 30 20:00:01 crc kubenswrapper[4553]: I0930 20:00:01.615550 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" event={"ID":"e629896f-1f6e-4759-ad61-7e555ffb2387","Type":"ContainerStarted","Data":"7e86e60749bc3fde7ce0c4e2dcc4896518652fed96b21e32f0217124d454f9e6"} Sep 30 20:00:01 crc kubenswrapper[4553]: I0930 20:00:01.615934 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" event={"ID":"e629896f-1f6e-4759-ad61-7e555ffb2387","Type":"ContainerStarted","Data":"0567471f30e8f53fd14b66d88a5626eefdf72612c3d9faea54054e67c38c9a52"} Sep 30 20:00:01 crc kubenswrapper[4553]: I0930 20:00:01.642797 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" podStartSLOduration=1.642774943 podStartE2EDuration="1.642774943s" podCreationTimestamp="2025-09-30 20:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 20:00:01.637907792 +0000 UTC m=+1654.837409932" watchObservedRunningTime="2025-09-30 20:00:01.642774943 +0000 UTC m=+1654.842277093" Sep 30 20:00:02 crc kubenswrapper[4553]: I0930 20:00:02.504646 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 20:00:02 crc kubenswrapper[4553]: E0930 20:00:02.505397 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 20:00:02 crc kubenswrapper[4553]: I0930 20:00:02.635237 4553 generic.go:334] "Generic (PLEG): container finished" podID="e629896f-1f6e-4759-ad61-7e555ffb2387" containerID="7e86e60749bc3fde7ce0c4e2dcc4896518652fed96b21e32f0217124d454f9e6" exitCode=0 Sep 30 20:00:02 crc kubenswrapper[4553]: I0930 20:00:02.635282 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" event={"ID":"e629896f-1f6e-4759-ad61-7e555ffb2387","Type":"ContainerDied","Data":"7e86e60749bc3fde7ce0c4e2dcc4896518652fed96b21e32f0217124d454f9e6"} Sep 30 20:00:03 crc kubenswrapper[4553]: I0930 20:00:03.976313 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" Sep 30 20:00:04 crc kubenswrapper[4553]: I0930 20:00:04.156567 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e629896f-1f6e-4759-ad61-7e555ffb2387-config-volume\") pod \"e629896f-1f6e-4759-ad61-7e555ffb2387\" (UID: \"e629896f-1f6e-4759-ad61-7e555ffb2387\") " Sep 30 20:00:04 crc kubenswrapper[4553]: I0930 20:00:04.156685 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e629896f-1f6e-4759-ad61-7e555ffb2387-secret-volume\") pod \"e629896f-1f6e-4759-ad61-7e555ffb2387\" (UID: \"e629896f-1f6e-4759-ad61-7e555ffb2387\") " Sep 30 20:00:04 crc kubenswrapper[4553]: I0930 20:00:04.156893 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjq6t\" (UniqueName: 
\"kubernetes.io/projected/e629896f-1f6e-4759-ad61-7e555ffb2387-kube-api-access-fjq6t\") pod \"e629896f-1f6e-4759-ad61-7e555ffb2387\" (UID: \"e629896f-1f6e-4759-ad61-7e555ffb2387\") " Sep 30 20:00:04 crc kubenswrapper[4553]: I0930 20:00:04.158301 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e629896f-1f6e-4759-ad61-7e555ffb2387-config-volume" (OuterVolumeSpecName: "config-volume") pod "e629896f-1f6e-4759-ad61-7e555ffb2387" (UID: "e629896f-1f6e-4759-ad61-7e555ffb2387"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 20:00:04 crc kubenswrapper[4553]: I0930 20:00:04.165017 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e629896f-1f6e-4759-ad61-7e555ffb2387-kube-api-access-fjq6t" (OuterVolumeSpecName: "kube-api-access-fjq6t") pod "e629896f-1f6e-4759-ad61-7e555ffb2387" (UID: "e629896f-1f6e-4759-ad61-7e555ffb2387"). InnerVolumeSpecName "kube-api-access-fjq6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 20:00:04 crc kubenswrapper[4553]: I0930 20:00:04.172206 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e629896f-1f6e-4759-ad61-7e555ffb2387-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e629896f-1f6e-4759-ad61-7e555ffb2387" (UID: "e629896f-1f6e-4759-ad61-7e555ffb2387"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 20:00:04 crc kubenswrapper[4553]: I0930 20:00:04.260225 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjq6t\" (UniqueName: \"kubernetes.io/projected/e629896f-1f6e-4759-ad61-7e555ffb2387-kube-api-access-fjq6t\") on node \"crc\" DevicePath \"\"" Sep 30 20:00:04 crc kubenswrapper[4553]: I0930 20:00:04.260276 4553 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e629896f-1f6e-4759-ad61-7e555ffb2387-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 20:00:04 crc kubenswrapper[4553]: I0930 20:00:04.260295 4553 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e629896f-1f6e-4759-ad61-7e555ffb2387-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 20:00:04 crc kubenswrapper[4553]: I0930 20:00:04.652814 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" event={"ID":"e629896f-1f6e-4759-ad61-7e555ffb2387","Type":"ContainerDied","Data":"0567471f30e8f53fd14b66d88a5626eefdf72612c3d9faea54054e67c38c9a52"} Sep 30 20:00:04 crc kubenswrapper[4553]: I0930 20:00:04.653144 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0567471f30e8f53fd14b66d88a5626eefdf72612c3d9faea54054e67c38c9a52" Sep 30 20:00:04 crc kubenswrapper[4553]: I0930 20:00:04.652855 4553 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29321040-lkx6d" Sep 30 20:00:15 crc kubenswrapper[4553]: I0930 20:00:15.506423 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 20:00:15 crc kubenswrapper[4553]: E0930 20:00:15.507449 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 20:00:28 crc kubenswrapper[4553]: I0930 20:00:28.504559 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 20:00:28 crc kubenswrapper[4553]: E0930 20:00:28.505734 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 20:00:39 crc kubenswrapper[4553]: I0930 20:00:39.504886 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 20:00:39 crc kubenswrapper[4553]: E0930 20:00:39.506319 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 20:00:45 crc kubenswrapper[4553]: I0930 20:00:45.254500 4553 scope.go:117] "RemoveContainer" containerID="e723bb837bff29cbcd7be40ca76e68a47b2270b22b4ad9ed4b84c32865f45688" Sep 30 20:00:45 crc kubenswrapper[4553]: I0930 20:00:45.309637 4553 scope.go:117] "RemoveContainer" containerID="5b3b48e5cc114e37b82b361a145372fc813009d0a4276f9be4bd1c815092a7b5" Sep 30 20:00:45 crc kubenswrapper[4553]: I0930 20:00:45.375748 4553 scope.go:117] "RemoveContainer" containerID="c35f71ed62ab9c4849e25d2f14da54779822bd9438fec58dbc0c4ac04c0373ed" Sep 30 20:00:45 crc kubenswrapper[4553]: I0930 20:00:45.414628 4553 scope.go:117] "RemoveContainer" containerID="ad622983d2e249b22159fc9ce9068573954aa8aeaf612e174db930d57a095e88" Sep 30 20:00:52 crc kubenswrapper[4553]: I0930 20:00:52.505530 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29" Sep 30 20:00:52 crc kubenswrapper[4553]: E0930 20:00:52.506396 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51" Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.043471 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xm47s"] Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.050289 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-5p4j4"] Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.055874 4553 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-5p4j4"] Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.063301 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xm47s"] Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.069560 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-f7n8l"] Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.075667 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-f7n8l"] Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.146777 4553 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29321041-wvhrx"] Sep 30 20:01:00 crc kubenswrapper[4553]: E0930 20:01:00.147178 4553 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e629896f-1f6e-4759-ad61-7e555ffb2387" containerName="collect-profiles" Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.147197 4553 state_mem.go:107] "Deleted CPUSet assignment" podUID="e629896f-1f6e-4759-ad61-7e555ffb2387" containerName="collect-profiles" Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.147375 4553 memory_manager.go:354] "RemoveStaleState removing state" podUID="e629896f-1f6e-4759-ad61-7e555ffb2387" containerName="collect-profiles" Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.148018 4553 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.161860 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29321041-wvhrx"]
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.331776 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vls85\" (UniqueName: \"kubernetes.io/projected/134bd21e-15e9-429a-8d2d-21fd2357667b-kube-api-access-vls85\") pod \"keystone-cron-29321041-wvhrx\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") " pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.331849 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-combined-ca-bundle\") pod \"keystone-cron-29321041-wvhrx\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") " pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.332146 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-fernet-keys\") pod \"keystone-cron-29321041-wvhrx\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") " pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.332251 4553 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-config-data\") pod \"keystone-cron-29321041-wvhrx\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") " pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.433542 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-fernet-keys\") pod \"keystone-cron-29321041-wvhrx\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") " pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.433823 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-config-data\") pod \"keystone-cron-29321041-wvhrx\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") " pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.433941 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vls85\" (UniqueName: \"kubernetes.io/projected/134bd21e-15e9-429a-8d2d-21fd2357667b-kube-api-access-vls85\") pod \"keystone-cron-29321041-wvhrx\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") " pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.434113 4553 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-combined-ca-bundle\") pod \"keystone-cron-29321041-wvhrx\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") " pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.440753 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-fernet-keys\") pod \"keystone-cron-29321041-wvhrx\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") " pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.441448 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-config-data\") pod \"keystone-cron-29321041-wvhrx\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") " pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.446819 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-combined-ca-bundle\") pod \"keystone-cron-29321041-wvhrx\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") " pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.464603 4553 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vls85\" (UniqueName: \"kubernetes.io/projected/134bd21e-15e9-429a-8d2d-21fd2357667b-kube-api-access-vls85\") pod \"keystone-cron-29321041-wvhrx\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") " pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.471583 4553 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:00 crc kubenswrapper[4553]: I0930 20:01:00.975601 4553 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29321041-wvhrx"]
Sep 30 20:01:01 crc kubenswrapper[4553]: I0930 20:01:01.228653 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29321041-wvhrx" event={"ID":"134bd21e-15e9-429a-8d2d-21fd2357667b","Type":"ContainerStarted","Data":"b4b419203a0752a46b8edd77b17bf3803e969aabe77f77010cafe256a91c2295"}
Sep 30 20:01:01 crc kubenswrapper[4553]: I0930 20:01:01.228979 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29321041-wvhrx" event={"ID":"134bd21e-15e9-429a-8d2d-21fd2357667b","Type":"ContainerStarted","Data":"35ab35473864536d91e29c05fedfbba28c15fb887777b3d980cea8cb0319fc03"}
Sep 30 20:01:01 crc kubenswrapper[4553]: I0930 20:01:01.276876 4553 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29321041-wvhrx" podStartSLOduration=1.27685121 podStartE2EDuration="1.27685121s" podCreationTimestamp="2025-09-30 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 20:01:01.26457452 +0000 UTC m=+1714.464076710" watchObservedRunningTime="2025-09-30 20:01:01.27685121 +0000 UTC m=+1714.476353340"
Sep 30 20:01:01 crc kubenswrapper[4553]: I0930 20:01:01.523839 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78299d5b-ca49-4bfa-a23e-c81671ab07da" path="/var/lib/kubelet/pods/78299d5b-ca49-4bfa-a23e-c81671ab07da/volumes"
Sep 30 20:01:01 crc kubenswrapper[4553]: I0930 20:01:01.524399 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b60eae4e-80e4-4f1d-b7d9-7b498649fa67" path="/var/lib/kubelet/pods/b60eae4e-80e4-4f1d-b7d9-7b498649fa67/volumes"
Sep 30 20:01:01 crc kubenswrapper[4553]: I0930 20:01:01.549836 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b972cf08-0eee-4970-8825-a313fdddc23a" path="/var/lib/kubelet/pods/b972cf08-0eee-4970-8825-a313fdddc23a/volumes"
Sep 30 20:01:04 crc kubenswrapper[4553]: I0930 20:01:04.259510 4553 generic.go:334] "Generic (PLEG): container finished" podID="134bd21e-15e9-429a-8d2d-21fd2357667b" containerID="b4b419203a0752a46b8edd77b17bf3803e969aabe77f77010cafe256a91c2295" exitCode=0
Sep 30 20:01:04 crc kubenswrapper[4553]: I0930 20:01:04.259610 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29321041-wvhrx" event={"ID":"134bd21e-15e9-429a-8d2d-21fd2357667b","Type":"ContainerDied","Data":"b4b419203a0752a46b8edd77b17bf3803e969aabe77f77010cafe256a91c2295"}
Sep 30 20:01:05 crc kubenswrapper[4553]: I0930 20:01:05.666873 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:05 crc kubenswrapper[4553]: I0930 20:01:05.838018 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vls85\" (UniqueName: \"kubernetes.io/projected/134bd21e-15e9-429a-8d2d-21fd2357667b-kube-api-access-vls85\") pod \"134bd21e-15e9-429a-8d2d-21fd2357667b\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") "
Sep 30 20:01:05 crc kubenswrapper[4553]: I0930 20:01:05.838139 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-config-data\") pod \"134bd21e-15e9-429a-8d2d-21fd2357667b\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") "
Sep 30 20:01:05 crc kubenswrapper[4553]: I0930 20:01:05.838187 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-combined-ca-bundle\") pod \"134bd21e-15e9-429a-8d2d-21fd2357667b\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") "
Sep 30 20:01:05 crc kubenswrapper[4553]: I0930 20:01:05.838214 4553 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-fernet-keys\") pod \"134bd21e-15e9-429a-8d2d-21fd2357667b\" (UID: \"134bd21e-15e9-429a-8d2d-21fd2357667b\") "
Sep 30 20:01:05 crc kubenswrapper[4553]: I0930 20:01:05.844477 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "134bd21e-15e9-429a-8d2d-21fd2357667b" (UID: "134bd21e-15e9-429a-8d2d-21fd2357667b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 20:01:05 crc kubenswrapper[4553]: I0930 20:01:05.846599 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134bd21e-15e9-429a-8d2d-21fd2357667b-kube-api-access-vls85" (OuterVolumeSpecName: "kube-api-access-vls85") pod "134bd21e-15e9-429a-8d2d-21fd2357667b" (UID: "134bd21e-15e9-429a-8d2d-21fd2357667b"). InnerVolumeSpecName "kube-api-access-vls85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 20:01:05 crc kubenswrapper[4553]: I0930 20:01:05.873710 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "134bd21e-15e9-429a-8d2d-21fd2357667b" (UID: "134bd21e-15e9-429a-8d2d-21fd2357667b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 20:01:05 crc kubenswrapper[4553]: I0930 20:01:05.907529 4553 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-config-data" (OuterVolumeSpecName: "config-data") pod "134bd21e-15e9-429a-8d2d-21fd2357667b" (UID: "134bd21e-15e9-429a-8d2d-21fd2357667b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 20:01:05 crc kubenswrapper[4553]: I0930 20:01:05.940304 4553 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 20:01:05 crc kubenswrapper[4553]: I0930 20:01:05.940370 4553 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 20:01:05 crc kubenswrapper[4553]: I0930 20:01:05.940385 4553 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/134bd21e-15e9-429a-8d2d-21fd2357667b-fernet-keys\") on node \"crc\" DevicePath \"\""
Sep 30 20:01:05 crc kubenswrapper[4553]: I0930 20:01:05.940396 4553 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vls85\" (UniqueName: \"kubernetes.io/projected/134bd21e-15e9-429a-8d2d-21fd2357667b-kube-api-access-vls85\") on node \"crc\" DevicePath \"\""
Sep 30 20:01:06 crc kubenswrapper[4553]: I0930 20:01:06.280289 4553 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29321041-wvhrx" event={"ID":"134bd21e-15e9-429a-8d2d-21fd2357667b","Type":"ContainerDied","Data":"35ab35473864536d91e29c05fedfbba28c15fb887777b3d980cea8cb0319fc03"}
Sep 30 20:01:06 crc kubenswrapper[4553]: I0930 20:01:06.280327 4553 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29321041-wvhrx"
Sep 30 20:01:06 crc kubenswrapper[4553]: I0930 20:01:06.280349 4553 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35ab35473864536d91e29c05fedfbba28c15fb887777b3d980cea8cb0319fc03"
Sep 30 20:01:06 crc kubenswrapper[4553]: I0930 20:01:06.504181 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29"
Sep 30 20:01:06 crc kubenswrapper[4553]: E0930 20:01:06.504397 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51"
Sep 30 20:01:09 crc kubenswrapper[4553]: I0930 20:01:09.040247 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6d09-account-create-rf9p6"]
Sep 30 20:01:09 crc kubenswrapper[4553]: I0930 20:01:09.054922 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6d09-account-create-rf9p6"]
Sep 30 20:01:09 crc kubenswrapper[4553]: I0930 20:01:09.522816 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31b3010c-e679-4828-b02a-7c89c82d6f17" path="/var/lib/kubelet/pods/31b3010c-e679-4828-b02a-7c89c82d6f17/volumes"
Sep 30 20:01:10 crc kubenswrapper[4553]: I0930 20:01:10.030133 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8fd0-account-create-4j249"]
Sep 30 20:01:10 crc kubenswrapper[4553]: I0930 20:01:10.041227 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8fd0-account-create-4j249"]
Sep 30 20:01:11 crc kubenswrapper[4553]: I0930 20:01:11.525427 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="230810df-34fe-4a09-bf1a-ab53ba9faef4" path="/var/lib/kubelet/pods/230810df-34fe-4a09-bf1a-ab53ba9faef4/volumes"
Sep 30 20:01:18 crc kubenswrapper[4553]: I0930 20:01:18.503921 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29"
Sep 30 20:01:18 crc kubenswrapper[4553]: E0930 20:01:18.504793 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51"
Sep 30 20:01:22 crc kubenswrapper[4553]: I0930 20:01:22.057758 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-eee0-account-create-x9tl2"]
Sep 30 20:01:22 crc kubenswrapper[4553]: I0930 20:01:22.066796 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-eee0-account-create-x9tl2"]
Sep 30 20:01:23 crc kubenswrapper[4553]: I0930 20:01:23.520402 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82733d90-45f9-482e-a453-3b52a14b064e" path="/var/lib/kubelet/pods/82733d90-45f9-482e-a453-3b52a14b064e/volumes"
Sep 30 20:01:31 crc kubenswrapper[4553]: I0930 20:01:31.043533 4553 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndtgv"]
Sep 30 20:01:31 crc kubenswrapper[4553]: I0930 20:01:31.056338 4553 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ndtgv"]
Sep 30 20:01:31 crc kubenswrapper[4553]: I0930 20:01:31.506763 4553 scope.go:117] "RemoveContainer" containerID="4bb2cfb9da8db933d6501d51d8213616d799949324bc25b4ec124f839ff3ab29"
Sep 30 20:01:31 crc kubenswrapper[4553]: E0930 20:01:31.507147 4553 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-9n4dl_openshift-machine-config-operator(1e817c67-7688-42d4-8a82-ce72282cbb51)\"" pod="openshift-machine-config-operator/machine-config-daemon-9n4dl" podUID="1e817c67-7688-42d4-8a82-ce72282cbb51"
Sep 30 20:01:31 crc kubenswrapper[4553]: I0930 20:01:31.516783 4553 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6616a935-12f1-4f60-a206-1dbcfd9a6400" path="/var/lib/kubelet/pods/6616a935-12f1-4f60-a206-1dbcfd9a6400/volumes"